High fibre diet required

The government has identified the lack of such infrastructure as constraining the development and introduction of advanced technology and business applications and, therefore, as a ‘critical issue’ in achieving economic transformation. Cabinet Policy Committee, 21 Aug 2006.[1]

New Zealand was still tinkering with the engine of the old economy and trying to modify yesterday’s technology to cope with demands it wasn’t built for, while more visionary nations were reaping the economic and social benefits of multi-gigabit, light-speed communications.

More affordable, future-proofed fibre-optic infrastructure was driving huge global advances in collaborative research and development and e-commerce. Our science, technology, and academic community – already late to the game with its first generation advanced network – was being lapped by nations moving on to third and fourth generation connectivity. Meanwhile a large ‘under reconstruction’ sign hung over the Internet as we knew it, signalling further changes and challenges ahead.

While the government seemed happy to re-regulate and threaten telecommunications providers if they didn’t lift their game, it had done precious little to aggregate existing state-owned networks, make direct investment, or assume the kind of leadership necessary to boost our information superhighway capabilities.

Internet visionary Simon Riley warned that New Zealand’s lack of investment in next generation infrastructure was symptomatic of a larger policy failure, while similarly sized nations were treating investment in next generation fibre-based telecommunications as an arms race. He urged the government to take a close look at the kinds of ICT strategies being embraced in Singapore, Korea, Japan, Taiwan, and Hong Kong, which had gone way beyond infrastructure.

“They are on to the next generation of applications and if we’re not careful the advances happening there will see us slipping into near-developing-world status.” Canada, the United Kingdom, and Holland were on version 5.0 of their respective advanced networks while New Zealand was still struggling with version 1.0, which was no longer advanced by world standards.

Japan’s next-generation broadband strategy planned to eliminate zero-broadband cities and towns by 2008. It already had 95 percent coverage with 80 percent of the population on ultra-high speed, and by 2010 expected to have 100 percent broadband coverage with 90 percent of households having access to bidirectional 200Mbit/sec speeds.[2] The resulting plunge in DSL deployment had a big impact on carriers who were focusing on copper enhancement technology, with a growing number of households now disconnecting in favour of fibre to the home.[3]

South Korea had unveiled a grand vision for its IT industry in 2005, designed to raise the level of per capita income by 2010. As part of its U-Korea (Ubiquitous Korea) plan, the government was backing next generation growth projects, envisioning a future where people had uninterrupted Internet access via fixed lines or mobile networks, any time, anywhere.[4] This involved a wireless broadband portable Internet service, a broadband convergence network, a digital mobile broadcasting service, a home network service including interactive TV, video-on-demand and e-health and e-learning services, a digital terrestrial HDTV service, and a planned transition to IPv6.

There had been talk of moving to the next generation of the Internet Protocol (IP) for a decade, and it would soon become a major issue if the Internet was to expand in the way the innovators and engineers knew it must. The theory was that almost everything that could communicate soon would, including cars, microwaves, fridges, wristwatches, cellphones, and much more.

That would ultimately require upgrading the underlying IP layer to provide more addresses and greater flexibility. The network software layer also needed optimising so machines could have more complex conversations with each other. The European Telecommunications Standards Institute (ETSI) and the IPv6 Forum had joined forces in the late ’90s to promote the next generation of Internet protocol numbering, developed by the IETF.

It had been extravagantly predicted that the four billion addresses available under IPv4 – the existing 20-year-old 32-bit system – would be gobbled up by 2005. There was an urgent need to move from this increasingly fragile environment to the 128-bit IPv6 addressing scheme, extending capacity to 340 billion billion billion billion addresses to meet the largely unimagined requirements of the distant future. In 2001 IPv6 lead designer Steve Deering warned things would only get worse if there were further delays. Many opportunities had already been lost because existing network address translators (NATs) only worked with certain types of applications. IP telephony and peer-to-peer gaming, for example, wouldn’t work through network translators.
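
The arithmetic behind those headline numbers is simple enough to check. The short Python sketch below is purely illustrative: it evaluates the two address spaces from their bit lengths and shows the scale of the jump from IPv4 to IPv6.

    # Compare the IPv4 and IPv6 address spaces from their bit lengths.
    ipv4_addresses = 2 ** 32     # 32-bit addresses: roughly 4.3 billion
    ipv6_addresses = 2 ** 128    # 128-bit addresses: roughly 3.4 x 10^38

    print(f"IPv4: {ipv4_addresses:,} addresses")
    print(f"IPv6: {ipv6_addresses:.2e} addresses")
    print(f"IPv6 offers {ipv6_addresses // ipv4_addresses:.2e} times as many addresses")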

Cisco, Intel, 3Com, Ericsson, Hitachi, Nortel, and others were rapidly upgrading their equipment for IPv6 compliance for the new world of IP everywhere. In Yokohama, Japan, taxis, buses, and delivery trucks were continuously monitoring road conditions, speed, and weather in an IPv6 project called ‘real-space’ networking. Toshiba had been using IPv6 for its smart kitchen project, giving every appliance an IP address for instant maintenance and monitoring. Nokia was demonstrating its Mobile IPv6 technology for location-based directory and ‘always-on’ services, and Sony was preparing all types of media, including broadcasting, to be Internet-enabled using IPv6. In the future all Sony products would have an IP address.[5]

We were also reminded of how enthusiasm can get ahead of itself when, in 2007, a revised estimate was made of how we were holding out under the present IPv4 system. We hadn’t run out, but some people were concerned. In fact New Zealand had made very little progress in implementing IPv6, although the .nz Registry Services division of InternetNZ had upgraded its .nz name servers – located in Wellington and Albany, north of Auckland – to IPv6 connectivity. The new KAREN, the fulfilment of a dream set in motion by the country’s Internet pioneers 20 years previously, was also running IPv6.

Forrester Research had written a premature obituary for the Web as we knew it in 2001, claiming it would be supplanted by the X Internet (X = executable) by 2005. Downloadable code, enabling more entertaining and engaging experiences with on-line services, and the proliferation of smart Internet-enabled devices would reshape its role, it claimed. Real-time information on ‘everything from anywhere’ would enable businesses to have better control of their enterprise. The market for Internet devices and services was US$600 billion annually, and by 2010 it was expected to have rocketed to more than US$2.7 trillion worldwide, with 14 billion devices on-line.

Then, as visions of the Internet of the future were spinning in our heads, news of the next generation research, science, and academic network reached us. Work had begun in 1996, and by the end of 2001 Internet2, the super-high-speed network being developed by 180 US universities in conjunction with large corporations and research networks in Canada, Europe, and Australia, was raising the bar. The next stage of development, funded by the NSF, used protocols and middleware to support applications only recently dreamed of. The ‘Abilene’ Internet2 backbone operated at up to 2.4 Gbit/sec for very large file transfers, such as beaming holograms into an auditorium in real time, or telemedicine, where 3D images of organs could be sent to doctors around the world. At this stage New Zealand hadn’t even begun to consider whether it had a place in the advanced networking community.

Calling Noo Zeeland?

In the late ’90s Simon Riley and his partners shut down Pronet, a nationwide wholesale bandwidth provider, after a prolonged lawsuit with Clear Communications, whose network had been so unreliable it had cost them their business. Riley, who had been a public policy analyst in Canada, went back to the advertising industry for about 18 months before becoming involved in policy work regarding the future of the Internet in New Zealand.

He’d been asking around in government circles why the country didn’t have an advanced science and technology network. “I had seen all these policy papers coming through but nobody was making any references to advanced networking here, even though there was plenty of activity offshore. I tried to find out who would be responsible for such a thing and nobody appeared to be even interested.”

Riley couldn’t believe that a country talking about becoming a knowledge economy had no aspirations to develop a nationwide high-speed research network, so he dived headlong into projects that gave him leverage to begin lobbying for one. He contributed to Howard Frederick’s ‘Knowledge Economy Report,’ as well as conducting research, subcontracting, consulting, and running courses in the e-commerce arena. At an event at the eVision centre in Wellington in May 2001 Riley found himself in the company of many of the original Tuianet members who had brought the Internet to New Zealand, including John Houlker, John Hine, and Neil James. There and then he arranged a meeting of minds with a view to raising awareness about the need for a gigabit backbone. The group organised an open meeting with academic and industry luminaries as a call to action for a national science and research network.

The following week, by coincidence, he found himself elected to the council of InternetNZ. His first official business was to propose and form the affiliated but ‘arm’s-length’ Internet2 Committee, which included the core Tuianet crew and others advocating for a gigabit network. NZTE gave the group $50,000 to deliver a capability study with support from InternetNZ, Industry New Zealand, The University of Otago, and Cisco Systems. It commissioned Laurence Zwimpfer to research and compile Collaborating at Speed, published in October 2002:

Zwimpfer investigated the administrative structure and funding required from various partners to get things moving, believing a business case could be ready by the end of 2002 and contracts let in May 2003, with the network operational four months later. “It is no longer a question of whether New Zealand needs an NGI; the challenge now is to see how quickly we can establish one.” While Deputy Prime Minister Jim Anderton enthusiastically endorsed the report – “Access to next generation Internet is crucial to fulfilling the government’s aims of encouraging world-class innovation and strengthening global connections” – the response from other ministers and government officials, including Communications Minister Paul Swain, was negligible.

With the help of InternetNZ the pioneering group formed Next Generation Internet New Zealand (NGI-NZ) as a non-profit society, with Dr Neil James, assistant director of information services at the University of Otago, as chairman. He warned that New Zealand’s information technology industry would suffer unless we forged links with Internet2, claiming the nation’s research community was already hindered by a lack of affordable high-speed bandwidth.[6]

The new network would be based on the leading-edge IPv6 protocol, which would make more effective use of greater bandwidth, allow users to send and receive time-dependent files such as voice and video in a more reliable way, and use technologies beyond those currently available on the commodity Internet. Rather than just a TV set with a picture of a head at each end it could provide a whole wall of video or 3D images that could be manipulated between distant groups.

Virtual meeting places

In the past universities often sent individuals overseas on sabbatical for several months to work on projects. This network would provide extra ‘human bandwidth’, including body language, so people felt more comfortable working at a distance. As well as more media-rich video-conferences or 3D work-groups, there were likely to be many commercial spin-offs for the development of leading-edge applications. Otago University had a good reputation for spinning off businesses, including Australasia’s leading computer graphics production house Animation Research (ARL), which went on to develop graphics for the America’s Cup and sports events worldwide.[7]

Most of the Tuianet crew were on board NGI-NZ, and Riley was secretary. They hired Tone Borren as chief executive, and were doggedly determined that with or without overt government support they were going to build a network. Of course the first port of call was to get the country’s universities to subscribe, then half the Crown Research Institutes and the National Library. It was déjà vu, with all the old partners in the original Internet backbone paired up again and ready for a next generation, speedier, more robust backbone that would liberate them from the grip of costly per-megabyte commercial bandwidth and take them to gigabit heaven.

The new generation fibre-optic network would enable secure, high-end computing, capable of moving large amounts of information between databases for climate modelling, creating video walls, or manipulating 3D images between cities, or even across the world. Chairman Neil James conceded that for financial reasons a full national network might not be possible initially, but some elements were expected to be live late in 2003. “The technical side is not difficult, it’s more the business and political relationships – once these are sorted things could happen quickly.”

NGI-NZ’s dissatisfaction with the current telecomms model for bandwidth charging remained a concern. Prices were not based on the cost of provision: if you operated at 1Gbit/sec and suddenly required 2Gbit/sec, that was likely to double the cost, but under the proposed model, once the fibre was available, extra bandwidth would come at a small marginal cost. Virtually every other OECD nation had advanced networks. Higher learning institutions in the United States, United Kingdom, Canada, and Australia were already logged on to the Internet2 network. “If we don’t exploit the capabilities of this technology we will be left out in the cold as far as international research is concerned. More and more research is done in collaboration and advanced networking is required,” warned James.[8]

A decision was made to go to the market with a request for information to see what bandwidth providers and network equipment vendors were prepared to offer. Telecom and CityLink were on the shortlist and a decision was imminent. It was thought the Tranz Rail-owned fibre-optic cable between Wellington and Palmerston North would provide an important building block. The fundamental design would be based around Gigapops in the main centres including Auckland, Hamilton, Palmerston North, Wellington, Christchurch, and Dunedin. These neutral open connection ‘points of presence,’ based on the leading-edge IPv6 protocol, would be capable of gigabits per second throughput. While the government had shown strong interest in the network it had also made it clear most of the funding had to come from private initiatives.

“We were down to the final selection from the tender documents and weeks away from signing contracts, when the government decided to step in.” At the 2003 Knowledge Wave gathering, keynote speaker Rita Colwell, then director of the US NSF, had put a challenge to the government. She asked how New Zealand expected to continue working with all the nations that had advanced networks when it didn’t have anything even on the drawing board. She met with Prime Minister Helen Clark and Science Minister Pete Hodgson and warned them our whole science and research community was in peril if it wasn’t connected to the rest of the world. Hodgson finally realised he needed to do something and the government agreed to dig into its coffers.

Shamed into action

The government injected $200,000 to build a business case for a Gbit/sec network across national research institutions and out to an international link. A six-person steering committee was established, headed by the Ministry of Research, Science and Technology (MoRST) and including the Economic Development Ministry, Trade and Enterprise, the Ministry of Education, Treasury, and the Tertiary Education Commission. MoRST general manager of strategic development Andrew Kibblewhite said the business case would pull together details of likely demand from the research, education, and innovation sectors, and evaluate the best ways of providing and funding a network. The group would co-operate with NGI-NZ, which had 13 members – primarily universities and Crown Research Institutes – each of which had paid $15,000 to join. It was hoped work might begin early in 2004.[9]

Simon Riley and those left on the NGI-NZ committee kept faith, even though their ideas had largely been assimilated into the government’s big idea. They believed things would now move along rapidly, as they’d essentially done the groundwork. Inevitably a number of the NGI-NZ committee members, because of their expertise and ongoing lobbying, ended up on the advisory board for the new network.

The steering group research leaned heavily on the work already done by NGI-NZ and reported back that New Zealand’s researchers were indeed receiving a ‘less than enthusiastic response’ from international players once they discovered the country didn’t have an advanced network. If New Zealand did not provide international links to other advanced networks, our scientists “will be on the back foot, and falling behind fast.” Without a comparable network it would simply be impossible to participate in international projects that had spin-off economic and social benefits.[10]

NGI-NZ was commissioned by MoRST to manage an interim $8 million capability development fund, to start training people and running practical demonstrations of a temporary network of access grids set up with Wellington’s CityLink and the universities, which Riley called NGI-Lite. The government then formed ‘the committee from hell,’ which spent about two years going through all its processes.

By mid-2004 NGI-NZ chief executive Tone Borren expressed his frustrations. He’d had enough and was stepping down after 18 months of lobbying. In his parting speech he slammed New Zealand for failing to take advantage of its existing fibre capacity. Computer speed had doubled every 18 months for the past 15–20 years, data storage kept doubling, and optical fibre capacity had doubled every 9 months. “Networking capability over fibre-optics has grown beyond the capability of computers and we didn’t get to see any of that.”
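
Borren’s point follows directly from those doubling periods, as a rough compounding calculation shows. The Python sketch below illustrates the arithmetic only; the 15-year horizon is an assumption taken from the time span he quoted.

    # Compound the doubling periods quoted above over 15 years.
    months = 15 * 12
    compute_growth = 2 ** (months / 18)   # computer speed: doubling every 18 months
    fibre_growth = 2 ** (months / 9)      # fibre capacity: doubling every 9 months

    print(f"Computer speed: roughly {compute_growth:,.0f} times faster")   # ~1,000x
    print(f"Fibre capacity: roughly {fibre_growth:,.0f} times greater")    # ~1,000,000x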

While Cabinet had agreed to support the proposed gigabit network linking 127 sites throughout the country in partnership with MoRST, Borren warned we were still way behind most of the world. For example, bioengineering professor Peter Hunter at Auckland University had developed mechanical models of the human heartbeat and needed to work with experts offshore as part of a virtual organisation. The only thing holding him back was access to the network. New Zealand needed to take part in weather, humidity, and earthquake monitoring and multi-band videoconferencing.

“An observatory in the US wanted to upgrade the Mt John observatory to take part in an international astronomy experiment which uses 12 different wavebands, collecting 200 terabytes of data nightly, but we don’t yet have a network to connect into that.” Research networks in the United Kingdom and Canada were now capable of handling terabytes per second, the rest of the world was connected at Gbit/sec, and even Costa Rica and Fiji were joining the 48 other nations, while New Zealand hadn’t even begun work on its next generation Internet.[11]
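
Those volumes put the bandwidth problem in perspective. The rough calculation below is an illustration only, not a figure from the observatory project: it simply works out how long 200 terabytes takes to shift at different sustained line rates.

    # Time to move 200 terabytes at various sustained line rates.
    data_bits = 200e12 * 8    # 200 terabytes (decimal) expressed in bits

    for label, rate in [("100 Mbit/sec", 100e6),
                        ("1 Gbit/sec", 1e9),
                        ("10 Gbit/sec", 10e9)]:
        hours = data_bits / rate / 3600
        print(f"{label}: about {hours:,.0f} hours")

Even a dedicated 10Gbit/sec circuit would need nearly two days to carry a single night’s data, which is why projects on that scale were gravitating towards terabit-class research networks.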

Then something finally happened. The Crown-owned Research and Education Advanced Network of New Zealand (REANNZ) was formed as the operating company for what was to become KAREN, the Kiwi Advanced Research and Education Network. The new company was formally established on 15 December 2005. Meanwhile NGI-NZ stayed in a holding pattern with Riley taking over the CEO role. “We were still trying to decide what we should do with it, shut it down or try and morph it into something else.” Within a year it was mothballed. “MoRST remained in denial about NGI-NZ because they finally recognised what we had been doing and I guess they didn’t want someone asking, ‘Why didn’t you do this five years earlier?’”

Tuianet MKII

When the nation’s universities and research institutions were pulling together the threads of what was to become New Zealand’s first nationwide Internet backbone, there was a grand goal in mind: a national science, research, and education network. Those greybeard visionaries achieved their goal without government funding, but shortly afterwards Telecom and Clear cherry-picked the various institutions with tempting deals and the collaboration collapsed. Now, exactly 17 years after the first connection into the US Internet backbone was lit up at Waikato University, the original vision was back on track.

In April 2006 TelstraClear and its subsidiary Sytec won the $43 million government contract to build and operate the KAREN network for REANNZ and run it for four years. Research institutions and academia would match the contract figure for further development, and gain access to 10Gbit/sec on TelstraClear’s national fibre-optic network, running on a separate wavelength to its commercial traffic. Network routers and switches would be operated by REANNZ but managed by TelstraClear. After five years the network was expected to be self-sustaining, with operating costs met through member subscriptions.

Chief executive Charles Jarvie said there was never a mandate from the government to build any infrastructure, only to facilitate connectivity throughout New Zealand and with the international research community. Members would connect via standard 1Gbit/sec links through points of presence at CRIs, universities, and ISPs. Smaller members would take only 100Mbit/sec circuits. The first link was between AgResearch in Dunedin and Crop & Food in Invercargill.[12]

TelstraClear subcontracted Juniper Networks, the leading provider of broadband routing in New Zealand, to deploy its M-series multi-service routing platform to direct traffic across the backbone and handle the extremely high-end computational processes and information sharing on the IPv6 network. Computerworld journalist Stephen Bell asked REANNZ executive Charles Jarvie in September 2006 how it was that New Zealand had got so far behind the 40 other nations that already had advanced networks for science and education. Jarvie gave the politically correct, Reader’s Digest condensed version: “The stumble, Jarvie says, came in the mid-1990s, when the Internet became a popular medium and the core infrastructure was turned over from the universities and research establishments, which had run it, to the major telcos. In other countries, the universities went straight on to build their own special-purpose higher-capacity networks. In New Zealand, it fell off the radar for 10 years, and educational and scientific organisations were content to buy capacity from the telcos.”[13]

KAREN went live in mid-December 2006. The high-speed virtual communications link between tertiary institutions, research organisations, libraries, schools, and museums was 10,000 times faster than most domestic broadband connections. Its 18 founding members, the universities, CRIs, and the National Library, had signed up for three-year membership agreements. It soon had 16 members with POPs around the country connected to 28 sites, and peering agreements through 13 other national research and education networks (NRENs). It had two international connections, one to Sydney and one to Seattle. Through those international links it could reach 40 other NRENs and the institutions connected to them.

CRIs, including IRL, undertook a large amount of research for industry ranging from applied mathematics to new materials, imaging and sonar systems, and communication systems, such as optical switching fibre systems and wireless networking. They supported many national industrial partners and collaborated with universities. For example, IRL was actively engaged in wireless research with Auckland and Canterbury, and to a lesser extent with Waikato (WAND group) and Victoria University. Such research between universities and industry was supported by a number of companies including MediaLab, a successful ICT research aggregator. KAREN was exactly what they’d been waiting for to enhance their collaboration.

MediaLab functioned as a kind of mediator between high-level ICT research and practical industry applications. Or, as chief technical officer Peter Chappell put it, “matching big brains to real-world problems.” It worked with universities, government, and private industry researching critical areas of ICT development. MediaLab was engaged in a project with Auckland University and Korea’s Electronics and Telecommunications Research Institute (ETRI) on the development of cutting-edge “FTTX – or Fibre to the X – where the X is a premises, node or screen.” New Zealand trials of the WPON equipment, which passively splits the multiple wavelengths of light travelling down a fibre, were being considered. This would radically reduce and possibly even eliminate the need for electronics inside a roadside cabinet for delivery of high-speed broadband from the node to the home or business.[14]

KAREN chief executive Donald Clark said one of the most popular early uses of the network was videoconferencing. Research and development service provider Scion had staged a five-hour video-conference between Rotorua and the United States to further research collaboration, and participants had commented on the reliability and quality of the connection. AUT had launched its new access grid suite, allowing staff and students to engage with colleagues in New Zealand and around the world. The University of Auckland had used a portable access grid to connect with colleagues from Oxford University in England on mathematics education.[15]

The National Library of New Zealand signed an agreement to connect to KAREN in May. Chief executive Penny Carnaby said the all-you-can-eat broadband capability would enable it, and potentially all the libraries of New Zealand, to bring global knowledge networks to every New Zealander and showcase New Zealand digital content and stories to the world. “Our customers want access to a Web-based multimedia experience, with rich content, video, sound, and high quality pictures. KAREN provides the capacity to deliver this.”[16]

Roadmap for research

A roadmap through to 2009 was part of the output agreement between MoRST and REANNZ. The 2007 document outlined the skills and competencies within the KAREN community and sought to identify further professional development needs, along with the support structures, relationships, policies, and funding mechanisms to help expand the community of users. It needed to identify, develop, and deploy technical capabilities including hardware, middleware, and software applications. Collaboration was pivotal, both in terms of existing arrangements and how future local and international relationships might develop.

It took a broad perspective about how KAREN might embrace biotechnology, ICT, the creative industries, design and screen production, food and beverage, social sciences, and cultural areas. It would look at existing and new centres of research excellence, international research and partnerships, and how to develop valuable collaborative endeavours through the use of high-performance computing, access grids, videoconferencing, virtual research environments, visualisation, knowledge management tools, data mining, multimedia learning, and research resources. It would attempt to gain remote access to scientific sensors, telescopes, microscopes, and other valuable tools. The roadmap’s ultimate goal was to improve the outcomes for the education and research sectors using KAREN, and develop related infrastructure and services.[17]

By 2008–2009 KAREN planned on adding 100 schools with five connections and eight content or service partners with a connection each. In 2010 it planned to add a further 200 schools with 200 connections and 20 content or service partners with a connection each. “As more peering relationships are put in place, the network reach would extend, and more overseas network members would be aware of New Zealand’s availability to participate in new research and education collaborations,” said the REANNZ statement of intent.

KAREN was able to get to more than 90 percent of the North American and European routes, and 33 percent of the potential peering partners at the Pacific North-West Gigapop in Seattle. “It is unlikely that we will ever secure our target sector income from schools and libraries by approaching each institution separately (2600 schools and 260 libraries!). It is critical that planning for these sector networks has KAREN at the backbone and REANNZ as a trusted advisor.”[18]

In his first annual report, REANNZ chairman Jim Watson applauded the attempt to finally deliver an advanced science and education network, but said it was imperative the momentum was maintained.

Meanwhile the research community continued to come up with new uses for the advanced network. New Zealand’s major universities were using an access grid to combine multiple video streams from different sites into a single collaborative space. Nathan Gardiner, co-ordinator for the New Zealand Access Grid and IT manager of the Human Interface Technology Laboratory (HIT Lab) at Canterbury University, said the open source research tool used multicast traffic on the KAREN research network to enable people in different places to meet in a ‘virtual venue.’

It worked in a similar way to a chat room. You logged on and inside the venue server were virtual rooms. You could create your own virtual room or meet someone in a particular type of room. Multiple meetings could occur concurrently across the country. In 2006 Gardiner had participated in an international meeting where 100 different sites were linked in. The Access Grid also provided interfaces to grid middleware, enabling the creation of new tools for collaborative visualisation, data-sharing, remote control of instruments, and interaction with other grid resources. “It’s like videoconferencing on steroids … Not only does it do videoconferencing; you can also share your laptop screen, 3D models, data sets or other information, or share virtual meeting spaces.”[19]
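
Underneath the ‘virtual venue’ metaphor sits ordinary IP multicast: every site in a meeting subscribes to the same multicast group, and the network fans each stream out to all subscribers at once. The Python sketch below is a generic illustration of that mechanism using the standard socket library; the group address and port are arbitrary examples, not actual Access Grid venue settings.

    import socket
    import struct

    GROUP = "239.1.2.3"   # example multicast group address, not a real venue
    PORT = 50000          # example port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Join the multicast group so the local stack (and upstream routers)
    # deliver any traffic sent to it.
    membership = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)

    # Every participant that has joined the group receives the same packets,
    # which is what lets many sites share one meeting space.
    while True:
        data, sender = sock.recvfrom(65535)
        print(f"{len(data)} bytes from {sender}")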

Expat Kiwi Ian Foster, considered one of the fathers of grid computing, said there was no longer any excuse for New Zealand researchers not to be involved on the world stage. He told attendees at KAREN’s first international event in July 2007 that grid computing had changed the way researchers thought about problems, opening up ways to approach things differently. It was now possible to instantly make data or analyses available to research colleagues, instead of having to download from FTP[20] sites or wait until the research paper was published.

While first generation grid computing was about on-demand access and batch computing, the second generation paved the way for service-oriented science, which in Foster’s view was the future of e-research and would reduce time spent on mundane tasks. Service-oriented architectures meant developers could provide information tools as services that clients could access, so manual data-processing and analysis could be automated.
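
In concrete terms, ‘service-oriented’ simply meant putting an analysis routine behind a network interface so collaborators could call it rather than re-implement it or shuffle files around. The minimal sketch below, using only Python’s standard library, is a generic illustration of the idea; the endpoint and the toy statistics routine are invented for the example and are not part of any KAREN service.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def analyse(values):
        """Toy analysis standing in for a real scientific routine."""
        return {"count": len(values), "mean": sum(values) / len(values)}

    class AnalysisService(BaseHTTPRequestHandler):
        def do_POST(self):
            # A client POSTs a JSON list of numbers and gets a JSON result back.
            length = int(self.headers.get("Content-Length", 0))
            values = json.loads(self.rfile.read(length))
            body = json.dumps(analyse(values)).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Any collaborator on the network can now call the routine remotely.
        HTTPServer(("", 8080), AnalysisService).serve_forever()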

Grid technologies could accelerate the development and adoption of service-oriented science. Grids were communities of people as well as computers, and were based on trust in order to share resources and services. KAREN enabled New Zealand researchers to be part of these communities. “You have the network connecting you to the world at 622Mbit/sec. All you need to do now is show up,” he said.[21]

Geospatial data view

Another likely use of KAREN and other high-speed infrastructure reaching into local authorities, government departments, and universities was the ongoing development of New Zealand’s sustainability model for the international Digital Earth project.

The ability to zero in on a town or street on a Web-based digital map of New Zealand and access historical, community, ecological, and other data could be of enormous value in literally mapping our digital past, present, and future. Wider access to detailed public domain maps of cities, regions, towns, lakes, rivers, forests, coastal areas, roads, streets, and resources, including environmental data, was anticipated.

New Zealand had been officially charged with building the ‘sustainability’ model for the rest of the world as part of the Digital Earth project. Auckland City councillor and Digital Earth evangelist Richard Simpson wanted government and private sector holders of critical ‘non-sensitive’ map-based data to place this in the public domain as part of Digital New Zealand and the sustainability model.

Part of the solution was included in the Geospatial Information Strategy (GIS) approved in April 2007 as part of the e-government strategy, which itself had hooks into the Digital Strategy. It was designed ‘to improve knowledge about and access to assets owned, maintained and used by the government.’ It set out to co-ordinate and direct the way geospatial resources were developed and managed, ensuring compatibility and reducing duplication and fragmentation of effort. It recognised the government’s increasing reliance on such data for a wide range of activities from emergency services and national defence to utilities, resource management, biosecurity, and economic development.

The GIS would help ensure geographical information system databases, plus the various maps, land records, and related information provided to the public by government agencies and the private sector, were authoritative and up to date. A Geospatial Office hosted by LINZ was to be established. LINZ had the job of building and co-ordinating the model for public domain data; the Digital Earth Society would be a kind of watchdog to ensure this was done in an open way that enabled new stewardship models.

Simpson said dialogue to get different government agencies on board was ongoing and while the Geospatial Strategy was a step forward, it was still very much under the umbrella of LINZ, which was looking at it in the old way rather than with the new approach. The silo mentality about ownership and control of data inherent in the old analogue culture was still a deterrent to success. The old top-down model was no longer the way to work. “Having bureaucracies structured to be stewards for data about the sustainability of places is ridiculous when you consider the sort of technology that is now bringing universities together and opening up data and expertise for peer review by people who may be world authorities in other places and can make suggestions on how to improve or change things.”

There was a clear need to engage with private enterprise and have ‘community collaboration,’ rather than just focusing on government as the source of all data. Many larger nations were strongly engaged, and often had huge financial commitment from large corporations. “That makes it even more important for New Zealand to be seen to be getting on with the job and showing it can take the lead.” Simpson was convinced the only real way to meaningfully manage data was to index it through a place, geospatially. “It’s about getting access to information pertaining to sustainability of places; if you ever want to be a knowledge economy or an information society this is a fundamental thing. We’re not there until we can get to this stage.”

Digital Earth was about turning things upside down and could be quite radical in the way it opened things up, allowing transparency of governance – in effect delivering a whole new human right. That was one of the reasons Simpson was putting his energy behind Digital Auckland. “We need to show what we can do as a city, and start getting people on board who can deal with the tons of data the council is sitting on.” Funding would be used for broadband, urban fibre network ducting, and other projects to ensure multi-gigabit Ethernet and wireless access for everyone by 2010.

This was important, said Simpson, in order to attract creative industries and investment. Concurrently, he said, there was a need for councils to sort out important data that related to land-based resources. “Basically we’re awash with data and the challenge is, how do we open up the silos of council and other agencies?” Ideally, he said, infrastructure could be created so that it effectively became ‘a geo-spatial data clearing house’ to assist in the controversial proposal for a single Greater Auckland City embracing Auckland, North Shore, Manukau, and Waitakere.

Simpson insisted the one-city approach was a no-brainer. “It’s just ridiculous the way that it’s divided up, it’s in a dysfunctional state, we need to bring it all together, and the way to achieve that is to get the information working seamlessly.” There was an urgent need to come up with a common vision everyone could buy into, and Digital Earth provided that sort of framework. “There’s this big global vision, but you’re not ever going to get there unless you can get a city working together, and obviously if we can get this city working together and take it beyond we can start getting New Zealand working together.”[22]

Private fibre growth

While KAREN was finally allowing the country’s science, research, and academic institutions to catch up on high-bandwidth research and collaboration, the private sector had not been slacking. A growing number of communities and independent network providers were co-operating to deliver high-speed fibre networks, among them those that had received seed funding from the government’s MUSH and Broadband Challenge funds, which encouraged the development of open networks.

Murray Young, senior consultant with Teleconsultants, believed LLU had become a distraction from addressing the nation’s more important infrastructure needs. While LLU might break Telecom’s monopoly over the telephone lines and was an important step to increase competition, it would not deliver the broadband speed New Zealanders required for real economic growth. The present infrastructure using phone lines and limited fibre-optic cable was simply too slow or too limited in geographic coverage to provide real broadband to a significant proportion of the population.

“It’s a chicken and egg situation. Without the infrastructure we can’t provide the applications like telephony, video, pay TV and gaming that drive the mass markets. Without the mass markets there isn’t a business case to drive the investment in infrastructure. LLU and ADSL2, important as they are, distract from the fact that this is required now. The current network model with a main telco owning the infrastructure, with other players renting access and services for resale to end customers, will not deliver the speeds we need,” said Young.

The alternative model emerging around the world was open access networks (OANs). These networks were owned by an independent body, usually comprising local government and community organisations. Simple economics of shared costs applied, with the efficiency gained by building only one network and a common technology platform for use by all service providers. The OAN model might even be viable in rural areas where traditional telcos and ISPs had been reluctant to offer their services. While the model was still in its infancy in New Zealand, Young said there had been uptake, particularly in the education and government sectors, and it could easily be extended to business and residential customers.

Examples included the Hamilton Urban Fibre Network (UFN), a collaborative project led by Hamilton City Council with support from Wintec, Environment Waikato, the University of Waikato, Waikato DHB, and private partner Lite Up. Funding was being used to develop a publicly owned broadband infrastructure available to any party within the network coverage area, with important links between the key partners involved with the project. “New Zealand’s digital future cannot be left to the telcos alone to decide. Broadband is an enabler, vital to the economic development of local and regional areas. Councils need to be developing specific broadband business plans to ensure that their areas get the best bang for their broadband buck,” said Young.[23]

The arrival, from September 2006, of an alternative long-haul backbone from independent dark fibre services provider FX Networks elicited a sigh of relief from a number of businesses and government departments that were looking at ways of saving money on big bandwidth. FX Networks had made an initial investment of $14 million in its Auckland–Wellington backbone, and was offering high-speed services through its own ISP, the former CRI-owned Comnet, which it had acquired in 2004.

FX Networks had cobbled together its own gigabit-speed nationwide network through investment and partnerships with carrier-class operations, including pioneering CityLink in Wellington, which had its own ‘open access’ dark fibre. FX Networks owned 500km of the fibre in its backbone; a third of the network was leased from Ontrack, owner and manager of New Zealand’s railway infrastructure, and the balance came from Kordia and Vector Communications. The company offered IP telephony exchange services and peering for other Internet providers and businesses. The urban network providers could buy long-haul capacity directly or through any of FX Networks’ other partners.

FX Networks won the contract for the Government Shared Network (GSN) national backbone and ISP services, and was delivering uncapped dedicated speeds of 100Mbit/sec to 20Gbit/sec to strategic top 100 clients. Chief executive Murray Jurgeleit said once businesses were aware they could have this kind of bandwidth they began to do things very differently from when they felt constrained by the Telecom mentality. “If we give them attractively priced big pipe they start to rethink their data centre operations. ‘Do we need five processing centres around the country or only two?’ And ‘how can we do a better job of disaster recovery?’ They begin to change the way they think about structuring their hosting and applications to very good effect at times.”

FX Networks was also the first network aggregator for KAREN, making advanced networking much simpler for some of its customers. “The challenges of managing the complex router configurations required for a core KAREN connection can be overwhelming for some of our members,” said KAREN chief executive Donald Clark. Accredited network aggregator services would allow more people to experience KAREN as an enabler of their research, education, and innovation capabilities. In June 2007 FX Networks announced it would be spending a further $40 million over the next 18 months extending its private fibre-optic network to provincial centres and to Christchurch.[24]

In November 2006 state-owned Kordia, the re-branded BCL and THL Group, was actively engaging in alliances that leveraged its nationwide wired and wireless backbone capacity. It wanted to go much further than delivering competitive back haul and ‘inter-metro capacity’ for ISPs and new market entrants. Kordia had both fibre-optic cabling and digital microwave radio (DMR) capacity, enabling it to deliver broadband wireless access and a nationwide IP NGN, in direct competition with Telecom. However, Telecom, Vodafone, and TelstraClear and a number of second-tier carriers were also among its customers. One of the advantages of its wholesale ‘application-aware’ IPNetwork was that it could transport Ethernet or IP traffic with strict QoS and performance guarantees. In fact the greatest demand for its back-haul services was for VoIP and video.

Kordia’s $24.3 million acquisition of entrepreneurial ISP Orcon, in the midst of a $30 million upgrade of its technology to provide QoS-based services for clients, raised a few eyebrows in July 2007. Would Kordia be used by the government to broaden urban fibre network coverage and help education, health, and local authorities achieve their goals of nationwide interconnectivity? Was the government now getting into retail ISP services? Would Kordia use Orcon’s leading-edge network technology, geared for video-on-demand and IPTV, to give TVNZ a leg-up in the Internet game? Would Kordia be privatised?

Another wild card in the mix was government-owned Transpower, which operated the national high-voltage electricity transmission grid and was in the midst of a major upgrade of its 500,000km fibre-optic communications network. It had been largely dependent on optical ground wire strung along the top of its towers to meet its communications needs, but was engaged in a five-year, 44-project strategy to bring its technology into the 21st century. Performance from the new communications network, depending on the need, would range from 128kbit/sec to 155Mbit/sec, with peak capacity greater than 2Gbit/sec. It had existing fibre from the top of the South Island down to Christchurch and would lease dark fibre to infill where required.

“Most of our fibre builds are down the side of the road because we don’t have land owner or easement issues. What is very interesting is that so many other companies and utilities are also investing in their own fibre routes. We are working with multiple parties to share that infrastructure so that we can deploy the most cost-effective solution,” said Jim Tocher, Transpower’s general manager of IT.

Transpower was partnering with various local and regional councils and utilities, with Alcatel-Lucent providing the architecture and overall management capability of the network. “I think there is an opportunity across New Zealand for parties to aggregate the capabilities for multiple providers. I don’t see us going out there and selling or buying back a great deal of bandwidth. It’s not our core business, but it is advantageous to us that there is effective infrastructure in place and we can get access to continued investment by others, as well as ourselves.”[25]

The thought of state-owned and independent back-haul networks linking arms with the urban fibre and MUSH roll-outs opened up a world of possibilities. For true national coverage, however, Telecom always had to figure in the equation, although the landscape was about to change significantly once the ‘devilish details’ of the Telecommunications Amendment Act were cemented and it became clearer what shape the country’s main carrier might be in once the re-regulation engineers had unbundled it.

Open ground considered

The commercial carriers wanted secure anchor tenants before getting involved in fibre roll-out, so the Digital Strategy group tried to raise the awareness of prospective clients to improve the business case for aggregating network infrastructure. Former Digital Strategy chief Peter Macaulay believed there was common ground that should be explored across government-owned networks: Kordia, KAREN, health, education, and the urban fibre networks.

If the proposed health and education networks agreed to work together this would create sufficient demand to put fibre into under-served areas, and deliver ‘cash flow potential’ for network investors based on a five-year projection. “This is not about technology but forging relationships across organisations to ensure everything works together; it also creates an opportunity for major suppliers who haven’t got fibre between points A and B to capitalise that with guaranteed business.”

A good catalyst was the informal Five Networks group, established at the end of 2005 and chaired by Internet pioneer Dr Frank March, a specialist advisor within the MED. The loose affiliation of interested parties within various government agencies was keen to ensure wider collaboration of high-speed state-owned network resources. It was clear the GSN and the KAREN advanced network were dependent on last mile fibre to get to some of their users. These were often the same kinds of customers the Broadband Challenge urban networks needed to get to. “In a way it was a marriage made in heaven,” said March.

A meeting of the three networks was called to discuss neutral peering points to ensure all the networks could interconnect. “While KAREN was restricted in that it couldn’t carry commercial traffic, it could still run over the same physical infrastructure, so we made sure everyone was in the same space.” Health sector IT planners who were looking at establishing a national health infrastructure were invited to a second meeting and then education joined, making up the Five Networks. “We were essentially a non-executive body that was comparing notes and playing a co-ordinating role as required. The danger was that people planning away in their own patch weren’t cognisant of what others were planning. Education and health were reliant on a common infrastructure and that’s why we quickly got this level of collaboration,” said March.

By mid-2007, however, Cabinet sought a more formalised approach and decided the small gathering of IT gurus wasn’t appropriate and co-ordination would be better under the wing of the SSC, which already had responsibility for the GSN.

Preparing for impact

New Zealand’s slow Internet speeds were threatening to leave the nation out of the global economy, according to one of the founders of the Web, Larry Smarr, director of the California Institute for Telecommunications and Information Technology. He said New Zealand’s speeds were “a baby’s crawl compared to the spaceship” on the international scene, where American, European, and Asian nations were rolling out fibre-optic cables directly to houses and businesses, creating super-fast networks that permitted high-end businesses to flourish.

At current speeds, it would only be a matter of time before New Zealand was left out of the tech-business loop. “We still think of the world as divided by land and ocean, but the Internet doesn’t recognise these divisions. A country has to offer a better value proposition and it needs to make sure there aren’t barriers – bandwidth is becoming that barrier,” he said.[26]

Any plan for more widespread fibre-optic implementation needed more input and guidance from the government, because ultimately it would affect the country economically, socially, and in every other way. TUANZ chief executive Ernie Newman said converged communications technology, like electricity and railroads, was a general purpose technology that impacted not just on how we live, but on where we live. “Such a technology comes along rarely, probably less than once a generation. Broadband is the glue that delivers converged technologies to wherever they may be needed. In turn this enables an almost infinite range of added-value devices and services that revolutionise people’s work and lifestyles.”

Newman said he was waiting for the day when transport authorities realised that a few strands of fibre buried under a motorway to facilitate telecommuting and sophisticated traffic management came at a microscopic cost compared to constructing extra motorway lanes. “Factor in the rapidly increasing importance of environmental sustainability and carbon credits and you add yet another incredibly powerful argument for communications technology to be viewed as a direct substitute for transportation.”

He warned against trivialising the impact of technology that brought us leisure and entertainment. “Life is not just about work; it’s about learning, socialising and having fun. The next generation of entertainment will be highly focused on the telecommunications networks as a core delivery mechanism, giving all of us more choice of content and flexibility to use it on demand. International computer games are not to be scorned – many of our young people are using them as a very powerful means to learn life skills and work skills. Anyway, leisure is an economic activity – your interactive video game can be my livelihood.”[27]

Next level Internet

Simon Riley, a founder of NGI-NZ, the precursor to KAREN, said New Zealand needed to watch international developments and plan its own innovations because in all likelihood there would be a new Internet by 2012. “All the Obi-Wan Kenobis of the Internet community overseas have come to the same conclusion; the Internet is broken and it can’t be fixed. You can’t keep putting band aids on band aids, so they’re saying, ‘Knowing what we know now, what would we do if we had to start all over again with a blank page?’”

The Internet as we know it was doomed because of the ongoing challenges of security and the surge of video downloads. “If the next big driver for broadband is video or IPTV, then the reality is the current infrastructure cannot support it. If everyone starts downloading videos the whole global infrastructure will fall over.”

Riley’s apocalyptic views were supported by the November 2007 publication of a report from the Nemertes Research Group,[28] an independent US-based analysis firm that warned the Internet as we know it was heading for overload and ultimately brown-out unless backbone providers invested billions of dollars in new infrastructure.

It said a flood of new video and other Web content could overwhelm the Internet by 2010 unless backbone providers invested up to US$137 billion (NZ$182.5 billion) in new capacity – more than double their current intentions. “Our findings indicate that although core fibre and switching/routing resources will scale nicely to support virtually any conceivable user demand, Internet access infrastructure, specifically in North America, will likely cease to be adequate for supporting demand within the next three to five years,” said the study. The Nemertes study suggested demand for Web applications such as streaming and interactive video, peer-to-peer file transfers and music downloads would accelerate, creating a demand for more capacity.[29]

Dozens of projects worldwide aspired to displace, replace, rework, or radically improve what we knew as the Internet. Some believed incrementally adding to what was already there would be sufficient: if it ain’t broke, why fix it? Others feared a breakage was imminent, with all the fragile routing protocols, add-ons, updates, denial of service attacks, and abuse from spammers, hackers, phishers, and other net nasties. The roadmap ahead would only add more complexity as a plethora of mobile devices, interactive applications, and Internet-enabled devices, from phones and cars to home appliances and RFID tags, joined the rush to the Web.

The Internet was used by hundreds of millions of people daily and worked well in many situations, but it had been designed under completely different assumptions, according to Dipankar Raychaudhuri, a Rutgers University professor overseeing three so-called ‘clean slate’ projects. “It’s sort of a miracle that it continues to work well today.” Even Vint Cerf, one of the Internet’s founding fathers, had come out in support of the clean slate pioneers, saying a replacement for the Internet was needed because current technology could not ‘satisfy all needs.’ Researchers believed a new network could run parallel with the current Internet and perhaps even one day replace it.

Internet2, led by the US research and education community since 1996, remained the foremost advanced networking consortium, with its partners pushing the envelope with new technologies and capabilities that could have a fundamental impact on the future. The not-for-profit advanced networking group comprised more than 200 US universities in co-operation with 70 leading corporations, 45 government agencies, laboratories, and higher learning institutions as well as more than 50 international partners.

Internet2 was undergoing a transformation to become faster and more flexible, even offering scientists temporary circuits on-demand. It was becoming a hybrid network, with an option to transfer data using the traditional packet switching approach or the new optical capabilities which enabled users to acquire dedicated bandwidth for a fixed period. The first stages of the upgraded Internet2 went live in August 2007.

In 2003, the NSF funded the ‘100x100 Clean Slate programme’ at CMU, Stanford, Rice, and Fraser Research, which had since developed novel approaches to network design, optical and wireless access networks, network control, and congestion control. This was expanded in 2006 through the Future Internet Network Design (FIND) project. That work fed into efforts begun in 2005 when the NSF began to envision a framework that would enable US researchers to build and experiment with new designs and capabilities that would inform the creation of a 21st century Internet.

Out of the bottle

The next generation Internet would have built-in security and functionality to connect every kind of device. A means to experiment with and test some of those applications was the Global Environment for Networking Innovations (GENI) initiative.

Researchers needed to start thinking beyond the current Internet and consider radical new ideas for continuing challenges such as Internet security and ease of use, said David Clark, a senior research scientist in the Computer Science and Artificial Intelligence Laboratory at the Massachusetts Institute of Technology. “The US Department of Defence had been pushing for adoption of IPv6 to replace the widely used IPv4, but the GENI project would go years beyond the current vision for IPv6,” said Clark, a long-time Internet security researcher who served as chief protocol architect for the US government’s Internet development efforts during the 1980s. He hoped the GENI initiative would envision the Internet society’s needs 15 years ahead. It would deliver new core functionality for the Internet, including naming, addressing, and identity architectures; enhanced capabilities, including additional security architecture and a design for high availability; and new Internet services and applications.[30]

In 2007 NSF researchers were working up the design for the new test environment and had contracted advanced technology solutions firm BBN Technologies – which had helped pioneer the original development of ARPANET – to serve as the GENI Project Office. “GENI will give scientists a clean slate on which to imagine a completely new Internet that will likely be materially different from that of today. We want to ensure that this next stage of transformation will be guided by the best possible network science, design, experimentation, and engineering,” said principal investigator and project director Chip Elliott of BBN.

Meanwhile, in 2006, Stanford University began working on its own ‘clean slate design’ for the Internet. While complementary to GENI, it was to be funded by seven industrial sponsors, with researchers at the same location, enabling close collaboration across disciplines and among graduate students. It would bring together world-class researchers from several disciplines outside the traditional field of networking, including services, equipment, semiconductors, and applications, to deliver a broader, fresher perspective. Stanford researchers claimed the current Internet had “significant deficiencies that need to be solved before it could become a unified global communication infrastructure.”

They didn’t believe the shortcomings could be resolved by the conventional, incremental, and ‘backward-compatible’ style of academic and industrial networking research. The proposal focused on ‘unconventional, bold and long-term research that tried to break the network’s ossification.’ It wanted to visualise what the Internet would look like in 15 years and, ‘knowing what we know now,’ figure out how to start again with a clean slate and design a global communications infrastructure. The interdisciplinary research programme would create a loosely coupled breeding ground for new ideas in collaboration.[31]

Wamberto Vasconcelos, a lecturer in computing science at Aberdeen University, said he was sceptical about what was driving the push for a replacement for the Internet. “One of the key features of the Internet from its very beginnings was its anarchic approach. It has always grown organically – things that are good about it survive, while innovations that are not simply die off. If there was to be a wholesale replacement for the Internet I can see a great deal of interest and influence being brought to bear on it by major corporations such as Microsoft and IBM, not to mention the influence of government. Users – those who make the Internet what it is – could ultimately lose out. If major companies and government have more control over networks there would be better security, but the price users pay would be less privacy.”[32] The clean slate programmes being run by prestigious universities including Princeton, Stanford, Rutgers, and the Massachusetts Institute of Technology were not expected to bear fruit for at least another ten to 15 years.

In May 2006 Web inventor Tim Berners-Lee insisted the Internet should remain neutral, and efforts to fragment it into different services should be resisted. Attempts in the United States to charge for different levels of online access to the Web were not ‘part of the Internet model’, he said in an address to a conference in Edinburgh on the future of the Web.

He warned that if the US decided to go ahead with a two-tier Internet, the network would enter a dark period. “There’s one Web and anyone who tries to chop it into two will find that their piece looks very boring.” Berners-Lee, director of the World Wide Web Consortium, believed in an open model, based on the concept of network neutrality, where everyone had the same level of access to the Web, and all data moving around the Web was treated equally.

The ‘right’ model was that any content provider could pay for a connection to the Internet and put any content on to the Web with no discrimination. Even Vint Cerf, co-inventor of the Internet Protocol, stated in 2006: “The Internet was designed with no gatekeepers over new content or services. A lightweight but enforceable neutrality rule is needed to ensure that the Internet continues to thrive.”

Where the market demands

Dean Pemberton, a senior technical specialist working on telco and enterprise networks and a long-serving NZNOG member, suggested any changes to the Internet infrastructure in New Zealand would follow the old maxim of appearing first where there was sufficient market share and customer demand to justify investment.

“When people roll out these things they look at who they can sell it to and what the drivers are.” For example, the whole argument about IPv4 versus IPv6 hadn’t progressed much since 1991. “You still can’t convince 90 percent of the Internet population to get interested. As long as people can get their rich media content over the systems that get used today then we’re not going to see a push for anything new, no matter how cool it is. The minute there’s a killer app, whether it’s media sharing or the new version of World of Warcraft, that only works with a particular change, then you can start to convince a whole lot of people to be on it.”

The revised deadline for running out of IPv4 addresses had been extended to 2010. The Japanese government and American Defence Department were mandating IPv6 support; the dot.nz registry was running it, as was the advanced KAREN network. And while a number of ISPs and carriers had been toying with it, there were no clearly stated benefits. “No one’s asking for it and no one’s building it into their systems. We’ve got sneakier. Rather than allocating more addresses we’ve found ways around it, and perhaps in the future we’ll come up with NAT2 (network address translation version 2) to better handle what we’ve already got. We have learned from some of our mistakes and in other cases we haven’t: some ISPs are still allocating 16 million IPv6 addresses for every user, so we’re still wasting these things.”
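The scale Pemberton was gesturing at is easy to put into numbers. The sketch below, using Python’s standard ipaddress module, compares the whole IPv4 pool with a few per-customer IPv6 prefix sizes; the prefixes, and the use of the 2001:db8:: documentation range, are illustrative assumptions chosen for the arithmetic, not a statement of any ISP’s actual allocation policy.

```python
# Illustrative arithmetic only: the prefix sizes are assumptions chosen to
# show the scale, not a description of any ISP's allocation policy.
import ipaddress

ipv4_pool = ipaddress.ip_network("0.0.0.0/0").num_addresses
print(f"Entire IPv4 space:            {ipv4_pool:,} addresses")

for prefix in (104, 64, 56, 48):  # hypothetical per-customer IPv6 prefixes
    net = ipaddress.ip_network(f"2001:db8::/{prefix}")  # documentation range
    print(f"One IPv6 /{prefix} per customer: {net.num_addresses:,} addresses")
```

A /104 works out to roughly 16.7 million addresses, which is where a ‘16 million per user’ figure would sit, while a single /64 dwarfs the entire IPv4 Internet more than four billion times over.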

Pemberton said we would be remiss if we didn’t keep an eye on international advanced network developments to ensure we weren’t slipping behind or heading in a different direction. “We need to make sure we’re current with what is going on in the rest of the world but also acknowledge that New Zealand has a unique position in the market and our own innovations may well provide the answers we need.”

It’s easier to roll out for four million people than 20 million in the United States; New Zealand had been used as a technology incubator in the past for things like Eftpos systems and barcode scanners in supermarkets, he said. “We’re doing all we can but we’re a slave to the ‘tyranny of distance’ as the old Split Enz song goes; we’re at the end of a big wet piece of glass and nothing will change that. Most of the content we access, including emails and Web traffic, originates from outside New Zealand and the economics of that mean we’re not going to be able to enjoy some of the technologies that get used.”

Meanwhile, there were no clear winners in the next generation Internet stakes. “They all have amazing potential but they’re yet to show this. While we have KAREN, you have to remember that’s not the Internet in New Zealand, the commodity Internet here is what my mum and dad dial up to.” The logical thing would be to find out how advances in Internet2 relate to what New Zealanders are doing and applying small changes incrementally. “Look at the Web 2.0 applications which were born from company portals and are now being used in portals for end users. These lend themselves to our environment very well while we’re still waiting for the roll-out of high-speed broadband,” said Pemberton.

Swedish rounding

While there were major changes going on in relation to the next generation Internet, Peter Macaulay, who was elected president of the InternetNZ lobby group in July 2007, believed the greatest value for New Zealand would come from Europe. “In the US there’s such a legislative backlog, they’ve become anally retentive, with their own pockets of innovation being attacked by local telcos who claim this is unfair competition.” If we took the best of what was coming out of the world, New Zealand might yet become a WCL, he said.

Simon Riley also believed New Zealand had to plan its own way forward. “Generally speaking any municipally driven broadband initiative in the US, whether it’s wi-fi or fibre, has been attacked not only by telcos but ‘astro turf’ organisations; fake ‘grass roots’ lobby groups, think tanks, and business organisations often backed by telcos, who say broadband roll-out should be left to the marketplace and not involve taxpayers’ money.” He said smaller locations were more compatible with the New Zealand experience, including Canada and Sweden, where the political environment was more sympathetic to government intervention and there was usually an incumbent and a couple of competing players.

“I think we can create our own models but based on parallel developments; for example, Sweden’s Urban Network Association (SUNA) said in 2000 that they would have optical fibre throughout the country in five years, and they’ve pretty well done it, offering 100Mbit/sec in some places.” It was Sweden’s success that prompted Riley, who organised the Digital Cities Conference in Wellington in November 2005, to invite Lars Hedberg, secretary general, founder, and former chairman of SUNA, to be a keynote speaker. Unfortunately a conflict in schedules meant he had to make the presentation by broadband video, but he nonetheless got his point across.

Hedberg said, once the European Union allowed competition, municipalities started looking into costs and found they could save a lot of money by laying their own cables. The move in Sweden – population eight million – kicked off when many municipalities, which also happened to own power companies, began to lay fibre whenever they dug up the streets for electrical cables. Local businesses began asking if they could have access to connect their offices to storehouses and on it went.

Of the 290 communities in Sweden, about 200 had an open access network (OAN) in 2005, where anyone could lease dark fibre on equal terms. These networks served schools, health care centres, and local councils as well as widely scattered groups of small islands and areas of low population density. Open access was defined as networks ‘where the network owner doesn’t compete with the service provider.’ According to Riley, Sweden initially set the bar at 50Mbit/sec for OANs, and was now lifting it to 150Mbit/sec.

Big, hairy, audacious plan

If the government could pull together enough momentum and resources to launch a gigabit shared network for its departments and agencies (GSN) and an advanced science and education network (KAREN), there must be hope for other big pipe projects. Health and Education, for example, were certainly hoping to leverage this kind of collaboration. Extending Probe beyond 512kbit/sec and filling the rural and remote gaps would no doubt be a priority for Digital Strategy V 2.0, as would extending the reach of the urban fibre networks.

However, TelstraClear chief executive Allan Freeth seemed concerned that all the open access networking, both with and without government assistance, was cutting into his company’s potential market share. He told a Wellington Chamber of Commerce breakfast in mid-August of his fears about the ‘headlong rush’ into broadband network investment by some city councils, including WCC, which planned to spend $40 million to build a 100km broadband network. With many councils holding the vision of affordable high-speed connectivity for all, the definition of ‘affordable’ and the questions of subsidies, capital investment, and who would pay needed to be fully considered, he said.

Dr Freeth said local authorities in particular needed to understand how their local economies could potentially benefit from their investment in high-speed broadband, and how fibre-optic networks would be used and funded. “People think you build the network and that’s it. These networks are very sophisticated, large-scale computer networks with all the problems of software.” Maintaining and upgrading TelstraClear’s network cost $100 million–$150 million a year. Telecom’s soon-to-be-unbundled copper loop network would finally allow telecommunications companies wide access to New Zealand communities, but Telecom’s ‘restricted’ investment in the network meant consumers would not get the speeds they required. The answer, he said, was a fibre network running at least to customers’ kerbsides at a cost of more than $1 billion, but both main carriers faced the reality that network investment needed to provide the certainty of returns expected by company shareholders.

“High-speed connectivity will allow us to do many things but installing a broadband connection does not make people innovative, it does not in itself throw up new ideas and does not automatically mean more GDP, ” said Freeth. Maximising the economic potential of broadband required the development of a ‘cultural change’ and start-up incubators to foster the new ideas that would translate into new products and services.[33]

However InternetNZ’s Pete Macaulay said open access networks needn’t be complex or difficult to deploy or manage. “You keep the advanced stuff simple and accessible but you need adequate capacity. If you don’t have the capacity it becomes hugely complex, with technical solutions like deep packet inspection required to run the network. If we can avoid getting into that end, by having sufficient local capacity, then everything is faster, better, and less expensive, and robust enough to have multiple paths so you can run anything across it.”

He was determined to push forward with what he called his ‘big, hairy, audacious plans’ for pervasive open access fibre-optic networks. That access depended on having a big enough pipe into neutral peering points around the country. He would be watching with interest now that nearly everyone except TelstraClear was back in the peering game. Telecom’s change of behaviour meant there was a greater willingness to achieve common goals for broadband interconnectivity than ever before.

“We need to give Telecom credit for its 29 points of peering which will enable the fundamental principle of keeping local traffic local. You can’t do high-speed networking for entertainment or HDTV without it. If you have several thousand users trying to access an HDTV server in the US it would swamp the Southern Cross cable. However if you send it down one time, and give people access off local servers, there’s no problem. You end up with regional servers handling high volume content for local communities.”
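The arithmetic behind ‘keeping local traffic local’ is straightforward to sketch. The figures below – a stream bitrate, a viewer count, and an international link capacity – are hypothetical placeholders rather than measurements of the Southern Cross cable, but they show why one offshore fetch plus local replay is so much cheaper than thousands of individual offshore streams.

```python
# Back-of-envelope sketch of "keep local traffic local".
# All figures are hypothetical placeholders, not measured capacities.

hd_stream_mbps = 8       # assumed bitrate of one HD stream
viewers = 5_000          # assumed simultaneous local viewers
intl_link_gbps = 120     # assumed usable international capacity (placeholder)

# Every viewer pulling the same stream across the international link:
offshore_gbps = viewers * hd_stream_mbps / 1_000
print(f"All viewers fetching offshore:    {offshore_gbps:.0f} Gbit/s")
print(f"Share of assumed international capacity: {offshore_gbps / intl_link_gbps:.0%}")

# Fetch the content once, then replay it from a regional server:
cached_offshore_gbps = hd_stream_mbps / 1_000
print(f"One offshore fetch, local replay: {cached_offshore_gbps:.3f} Gbit/s offshore, "
      f"{offshore_gbps:.0f} Gbit/s carried on local networks instead")
```

Under these assumptions the uncached case alone consumes a third of the international link, while the cached case shifts almost the entire load onto domestic networks.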

Banking on broadband

Macaulay continued to ride his personal hobby horse: that pervasive broadband would be dramatically assisted by the creation of a ‘broadband bank’ to encourage investors to build more open fibre networks. He was positively evangelical about the wider impact of his ‘lily pad effect’ model, which, with the right cash injection, could connect open fibre networks across the country.

“If you build back-haul capacity in all the cities and towns the next thing you know it’s only 10km to join them together – suddenly the whole of New Zealand joins up. We don’t have to have a huge amount more back-haul fibre to do that.” What was needed, he said, was for Telecom and other commercial carriers to realise the future was not based around operating in a ‘walled garden.’ “If you connect everything up there’s room for everyone to have a share of the traffic.”

New Zealand was languishing because of lack of connection and Telecom was failing to deliver what the country required. The Broadband Challenge fund encouraged fibre build-out, but only $24 million was set aside for the six successful networks. More needed to be done, preferably through a government-facilitated fibre fund. “I’d like to see pilots for fibre to the home initiatives. This is an absolute key to getting everything working. As you get the entertainment flows to the home it will create the demand and funding for major back-haul changes, creating an entire feed-back loop.”

Macaulay was lobbying for around $250 million to kick off what he hoped could become a $1.75 billion investment repository. Funding would come from banks and other financial institutions. Anyone could apply for funds to build a network as long as it was open for anyone to use. He was keen to see some kind of policy to streamline resource consents to lay fibre, and encouraged co-operation between carriers and utility organisations to notify each other when there were roadworks, so ducting or additional cabling could be added.

“The Resource Management Act makes it very difficult to lay fibre. We need to get people to understand that fibre is inert; when you cut it, it doesn’t make sparks, it doesn’t pump smelly stuff all over the place.” It was important, he said, to get councils to understand their role in this community-owned infrastructure because it was a natural monopoly that also allowed commercial organisations to share fibre or run their own. He had no objection to Telecom taking advantage of any future fund as long as the cable and the conduit were open to everyone.

The goal was to roll out fibre to 85 percent of the population, enabling up to 20Mbit/sec access. The above ground component – the routers, switches, software, and operational stuff – was normal commercial-risk equipment with a three-to-five-year life. The big investment was needed in ‘long-life, low-latency, low-risk underground community-owned infrastructure.’ The MED distanced itself from the proposal in 2006 as did David Cunliffe, but knowing Macaulay’s determination and his lobbying influence it would be back on the agenda at every opportunity.

What’s a few billion?

InternetNZ chief executive Keith Davidson believed the attitude preventing more rapid roll-out of fibre could be traced right back to the ‘bad old days’ when Telecom was telling people they didn’t need broadband. “The community was very upset by this. Then we had Project Probe and – surprise, surprise – Telecom got to provide its broadband solutions in 14 of the 18 regions. In other words the government paid Telecom to install its equipment, so Telecom could continue to overcharge for its not very useful broadband product.”

And he said New Zealanders were kidding themselves if they believed ‘broadband’ was Telecom offering us some version of DSL. “Until we have people committed to the concept of fibre to the home, we won’t have a usable network for the sorts of things the future holds. If we had gigabit fibre, Sky TV and other digital services could be delivered without impacting anything else. There has to be a more robust method of delivering TV than having your signal disappear every time it rains.” He suggested an investment of $5 billion to get fibre to all New Zealand homes would still be peanuts compared to what could be achieved with it. “If we endorse broadband and technology we stand to increase our GDP hugely, and if we don’t do anything we stand to suffer in GDP terms. We can either go backwards or forwards.”

Copper can’t cope

The second major report on the state of our ICT infrastructure was delivered by the New Zealand Institute in its report ‘Defining a Broadband Aspiration’, which urged the country to shift focus from penetration to increasing the speed of the network.

The report claimed economic benefits to the country through pervasive higher speed broadband could range from $2.7 billion to $4.4 billion a year. “There is a significant cost to waiting. The longer New Zealand waits, the more economic value it will forego and so [it] should approach the investment in fibre with urgency.”

Report author David Skilling said his methodology was based on a conservative estimate of global and local growth rates and likely benefits over a five-year period, and in many cases the economic benefits could be much higher. “File sizes shared over the Internet have increased significantly and New Zealand will soon hit the bandwidth wall. Residential bandwidth demand will continue to grow strongly and will soon exceed the capacity of New Zealand’s network. Copper will soon be unable to provide for the demand expectations of high bandwidth homes in the UK. The same will be true in New Zealand.”

The report claimed that by 2012 most homes would demand more downstream bandwidth than ADSL or ADSL2+ would be able to provide. Within a decade, it was likely speeds of 50–100Mbit/sec would be demanded in many parts of the market. “New Zealand needs to invest for the future, not simply for today’s demand.”
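The copper argument is, at heart, an addition exercise. The per-application bitrates below are illustrative assumptions rather than figures from the Institute’s report, and the 24Mbit/sec ADSL2+ ceiling is the theoretical downstream maximum on a very short loop, which real lines rarely approach; even so, a modest ‘high bandwidth home’ overruns it.

```python
# Rough sketch: does a hypothetical "high bandwidth home" fit inside ADSL2+?
# Per-application bitrates are illustrative assumptions, not report figures.

household_demand_mbps = {
    "two HD video streams":        2 * 8,
    "video call":                  2,
    "online backup and downloads": 10,
    "gaming, browsing, email":     3,
}
adsl2plus_ceiling_mbps = 24  # theoretical downstream maximum, short loop only

total = sum(household_demand_mbps.values())
for item, mbps in household_demand_mbps.items():
    print(f"{item:30s} {mbps:>4} Mbit/s")
print(f"{'total demand':30s} {total:>4} Mbit/s  "
      f"(ADSL2+ ceiling ~{adsl2plus_ceiling_mbps} Mbit/s)")
```

On these assumptions the household needs around 31Mbit/sec downstream, and that is before any allowance for the speed lost on longer copper loops.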

Quality communications infrastructure was an important driver for attracting foreign investment in an increasingly competitive global environment. There were numerous examples of proposed foreign investments or projects not proceeding because of perceived inadequacies in the quality of New Zealand’s communications infrastructure. “New Zealand is more likely to receive greenfields foreign investment in the weightless economy than in, say, manufacturing.” In order to attract this type of foreign investment, world-class communications infrastructure was likely to be a particularly important driver, said the report.

Accelerated adoption could deliver an additional $16 billion in production and growth benefits and strengthened innovation. Failing to invest might mean new opportunities going offshore, our position relative to other nations worsening, and GDP losses rising as high as $1 billion–$2 billion, said Skilling. Fibre to the premises was the way to future-proof the infrastructure, with the resulting increase in speed opening the way for the full potential of economic value to be captured. Back-haul and offshore links needed to be upgraded as part of this process to reach at least 75 percent of the population, including all towns with populations greater than 20,000, by 2018. That, he said, meant fibre investment must commence with urgency.

“The aim should be to front-load the investments so as to capture economic value quickly. New Zealand must move quickly or much of the economic value will be foregone. A specific pathway must be mapped out with timelines for deployment to different market segments. Any debate should be on actions to be taken over the next few years rather than whether this aspiration is defined in exactly the right way.” The roll-out of fibre should commence by mid-2008, said the report.

Race for media money

There was potential revenue growth of $600 million in the digital media sector if higher speed pipes were provided for film and other content-rich industries. The high cost and low availability of fast communications was hurting the productivity of creative companies, which had to physically transfer files or heavily compress them.

The New Zealand film sector was generating more than $2 billion in revenues and 60–70 percent of this was earned in exports. New Zealand’s Asia-Pacific film sector had grown by 65 percent to almost $50 million since 2005 as production capacity shifted away from large United States productions towards the region. “Access speeds of competitor nations were growing rapidly and unless New Zealand could play as an equal it would miss out on potential growth. The benefits of faster broadband access at lower cost could lead to 15–20 percent improvement in productivity in the digital media sector, which regularly needed to move gigabyte files between locations,” Skilling said in the report.
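The productivity claim largely comes down to transfer times. The file size and link speeds in the sketch below are assumptions chosen for illustration, not figures from the report, but they show the gap between pushing a multi-gigabyte job over DSL-class links and over a fibre-class connection.

```python
# Illustrative transfer-time arithmetic for moving large media files.
# The file size and link speeds are assumptions, not figures from the report.

file_gigabytes = 50                  # e.g. a batch of high-resolution render files
file_bits = file_gigabytes * 8 * 10**9

links_mbps = [
    ("ADSL upstream (~1 Mbit/s)",     1),
    ("typical 2007 DSL (~3 Mbit/s)",  3),
    ("fibre-class link (100 Mbit/s)", 100),
    ("gigabit fibre (1,000 Mbit/s)",  1000),
]

for label, mbps in links_mbps:
    hours = file_bits / (mbps * 10**6) / 3600
    print(f"{label:32s} {hours:7.1f} hours")
```

At the assumed DSL-class speeds the transfer takes days or a working week; at fibre-class speeds it drops to an hour or a few minutes, which is the difference between couriering hard drives and working collaboratively online.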

The next stage of the Institute’s project would focus on defining a specific pathway to fibre. “New Zealand should develop an efficient pathway to the rapid roll-out of fibre with an initial focus on investing in high-value segments from which benefits can be realised rapidly.”

Within weeks of the two major reports condemning our low levels of investment in telecommunications, new Telecom boss Paul Reynolds agreed to accelerate the carrier’s NGN plans.

Originally Telecom claimed there was a billion-dollar shortfall between what it planned to spend and what was needed to achieve the government’s Digital Strategy goal of 5Mbit/sec speed to 90 percent of New Zealanders by 2010. Reynolds said Telecom had undertaken a ‘re-planning process,’ looked at its annual investment strategy and determined how much could be pointed towards the IPNetwork. “We’re planning to create a world-class next generation network and broadband footprint for New Zealand – it’s as simple as that,” he said. A revised $1.4 billion plan to accelerate the next phase of the NGN footprint would deliver speeds of ‘up to’ 20Mbit/sec to all towns with 500 or more phone lines over the next four years. This now formed part of a legally binding commitment to the government as part of the operational separation process.

David Cunliffe said the move signalled the incumbent was facing the future and investing. “That ups the ante on the rest of the sector to get in as well … the real challenge now is for the public and the sector to get their heads around the fact that a sea change is occurring.”[34] The additional investment in fibre across rural and urban New Zealand would support sophisticated offerings including faster broadband up to 20Mbit/sec, VoIP and new technologies and services that might require even faster speeds in the future.

“Operational separation has given Telecom the opportunity to re-plan its broadband strategy and accelerate the upgrade of the existing network in ways that will support New Zealanders’ aspirations for the digital age. Specifically, we plan to install more fast ADSL2+ technology, more fibre to the street, and create a next generation network capable of supporting a wide range of world-class IP-based services for our wholesale and retail customers,” said Reynolds.[35]

Telecom faced fines of tens of millions of dollars if it reneged on its commitment to deliver 10Mbit/sec speed to 80 percent of New Zealanders and 20Mbit/sec to 50 percent of the population by 2010. Telecom’s network division would also be forced to give rival carriers the same deal on services it gave its own retail operation. It could face fines for breaching that commitment plus $500,000 a day for every day any breach continued. “Plus I’ve got all sorts of other swords of Damocles in my toolbox if I need them,” Cunliffe told NZ Herald telecommunications writer Helen Twose. Reynolds said Telecom would “run like greyhounds out of the gate and get the job done” within four years.[36]

Serious or a sidestep?

Telecom’s next generation IPNetwork had been underway since 2001. A forward-looking statement in July 2004 outlined a decade-long ‘next phase’ where a billion dollars would be invested in rolling out fibre and next level DSL to bring broadband to most New Zealanders. In 2006 Telecom had offered to increase investment in its back-haul network by spending hundreds of millions in rolling out fibre to almost all towns. That offer was withdrawn when the government decided on re-regulation.[37]

In the lead-up to the operational separation decision Telecom had been considering the investment required to deliver broadband speeds of 20Mbit/sec to 100 percent of customers. It was suggested its early 2007 investment plan could have delivered 20Mbit/sec to about 30 percent of customers by 2011. Various numbers were being tossed around about the cost of completing that task, most of them in the billions. “Numbers like $100 million are doable, $3 billion less so. If the government wants to target higher levels of penetration or performance there’s a gap that needs filling,” said Telecom CFO Marko Bogoievski in March.[38]

Then in April Telecom chairman Wayne Boyd said Telecom was prepared to spend only one third of the $1.5 billion required to achieve 5Mbit/sec to 90 percent of New Zealanders by 2010 if operational separation went ahead.[39] In August Telecom, which had been chastised locally and internationally for many years over its failure to reinvest in its network, had forecast capital expenditure of around $950 million for the 2007–2008 year. So was there some creative accounting going on here?

If it was going to cost $1.5 billion to get 5Mbit/sec to 90 percent of Kiwis, why did Reynolds think he could get at least 10Mbit/sec out there for less? And how much of the investment being promised by Telecom over the next four years had already been foretold, withheld, and was now being put back into play again?

Then you had to take into account Telecom’s own statement that only 75 percent of customer lines were capable of speeds of more than 6Mbit/sec, and it was doubtful you could get more than 8Mbit/sec over 65 percent of them. While Telecom claimed 93 percent of New Zealanders could get broadband through their phone lines, the average speed for the 38 percent who had taken up the service by September 2007 ranged from 2Mbit/sec to 3Mbit/sec.[40] If that didn’t give a clear enough idea of how difficult the journey to 20Mbit/sec was going to be, another sobering fact was the dire shortage of contract labour, which had already been cited as one of the reasons even existing broadband goals were unlikely to be met.

Reynolds had made the proviso that the pricing argument wasn’t over and Telecom planned to engage energetically to get the deal it wanted. Final pricing for wholesale broadband services and LLU would be critical ingredients in escalating its ADSL2+ and fibre to the street roll-out, he said. Clearly Telecom had cut some kind of deal during the Commerce Commission hearings over pricing of LLU access.

The Commission’s decision, having considered ‘other benchmarking factors,’ increased the fees competitors would have to pay for using Telecom’s unbundled network by 14–20 percent per line from the interim announcement in July. The new numbers had ISPs smarting and would inevitably mean lower margins and higher costs to consumers. So what was achieved by that? You had to wonder how serious the government was about ensuring pervasive high-speed connectivity across the country when its own competition watchdog was raising the barrier to entry.

The announcement that Telecom planned to roll out two million metres of fibre and deliver 3600 cabinets to take fibre closer to the nation’s urban homes by 2010[41] resulted in what sounded like a collective jaw dropping across the competitor space. With a few exceptions those planning investment in LLU seemed horrified. However it was surprising they hadn’t been a little sharper. While their focus was on the copper, Telecom had been planning ahead. Its next generation roadmap hadn’t exactly been a big secret; it had even disclosed plans to accelerate that process when it met with wholesale customers in June.

The greater concern seemed to be that Telecom was cabinetising the same exchanges competitors were installing their local loop equipment in. In other words Telecom was vacating the premises to take its equipment closer to its customers just as its competitors were gearing up to share its network in the old centralised exchanges. Was this shades of Wellington and Christchurch, where Telecom had targeted competitive services in the exact spots where TelstraClear was delivering a full service? Those locations quickly became the most competitive in the country with the lowest prices across all services. What was about to happen in the local loop wasn’t exactly what the government had in mind when it set out to improve broadband competition. Sure, competitors would most likely get access to the cabinets if there was room, but that was another strategy altogether and it could require another period of protracted debate to clear the way.

Ahead of the mandate for firming up operational separation Telecom appeared reluctant to advise competitors about the specifics of its roll-out and was attempting to skirt around the government’s clearly stated guidelines for separation. InternetNZ was concerned there were serious deficiencies in Telecom’s proposal.[42] Its executive director Keith Davidson described Telecom’s draft undertakings as ‘death by a thousand cuts,’ which lacked detail and subtly diluted requirements for behavioural changes. He was particularly concerned about the lack of detail and disclosure around the roll-out of future NGN services, including fibre. He was also disturbed that Telecom appeared to be ignoring the ban on short-term performance incentives across separated Telecom divisions. In Telecom’s draft undertaking up to 80 percent of the wholesale manager’s incentive could reflect the performance of Telecom as a group.

TUANZ said Telecom faced a juggling act in terms of its legal obligations to the government and its shareholders which were at cross-purposes with the intent of operational separation – which aimed, among other things, to improve the availability of services to Telecom’s competitors.[43] There were concerns that sorting out such conflicts of interest could put the practical economic lifecycle of unbundled local loops at risk.[44] David Cunliffe warned Telecom to get its separation undertaking back on track or he would act to do so himself.

View from the summit

The overall outcome of November’s Digital Strategy Summit echoed previous talk fests: the desire for New Zealand to lift its game in the global market and to use ICT to help it win the race was clearly articulated. There was a strong focus on the need for greater resource sharing between carriers, greater levels of competition and stronger industry-government collaboration.

Cunliffe challenged a new generation of innovators to take New Zealand’s ideas, companies, sector and economy to the next level. “We need to be better than the global market in our chosen niches – the race is getting faster and we must play to win.” Greater partnership was needed between government, the ICT sector, business, and the community. He said the government’s long-term vision was for fibre to the home[45] and warned Telecom to go much further with its NGN plans. Unless the country got a lot closer to meeting the government’s OECD objectives, more intervention would be necessary.

There was no question the serious players would persist. Unbundling the copper was too good an opportunity to walk away from, even with slimmer margins and greater investment required to get into the essential cabinets. The bigger questions of equitable fibre access, the skills crisis, and how to leverage technology to boost our exports were also pressing. The government wanted to see 90 percent of households with 20Mbit/sec access speeds by 2012 rather than Telecom’s proposed 80 percent at 10Mbit/sec.

The long-term goal was to have fibre to the home; despite competitors’ cries of foul over cabinetisation plans, the local loop would still remain viable in many areas, but fibre to the node (FTTN) was ‘simply necessary.’ Cunliffe said New Zealand needed to think outside the square when it came to funding infrastructure builds, and the government was open to subsidies for cable projects, including funding its own cable across the Tasman to improve Internet traffic price and capacity.

In 2007 only 54 percent of Telecom’s rural customers could get broadband services and its average net investment in rural areas had been ‘negligible.’ Cunliffe hinted Broadband Challenge funding could serve rural areas and outer urban areas if communities could link fibre loops together.[46] National party IT spokesman Maurice Williamson proposed that all government entities make their land available for the laying of fibre; for example, laying empty pipes down the side of the road when building new roads so that any company could lay fibre there. He suggested there be a mandate that every new subdivision have not only water, sewerage, and electricity to each house but a fibre connection as well.

Rod Drury, chief executive of Xero, suggested the government make a debt-type investment in infrastructure in order to get open access broadband out to as many people as possible. An investment of $2 billion–$3 billion seemed like a small amount of money, considering the possible return on investment.[47] Broadband seer Simon Riley agreed that an injection of $3 billion–$4 billion – perhaps through posting infrastructure bonds – would make all the difference, even if the government had to cut a deal to roll the assets of Kordia into a new company and get other big players, especially lines companies, to take a minority shareholding.

“It’s not a big step; essentially the lines company would be open access anywhere.” The government was already investing in broadband through KAREN, GSN, and Kordia and was talking about health and education. In fact the government was the biggest customer for these services, but was still failing to fully co-ordinate its investment. Perhaps it was now time to consolidate its resources to benefit everyone, said Riley.

While some progress was being made on the three wealth creation areas singled out by the government – biotechnology, ICT, and the creative industries – not enough was happening. We needed to have a long, hard think about what we needed to do as a nation to deliver software as a service, and the investment required, not just in the three-year political cycle leading up to an election but with long-term goals in mind. “If we are going to talk about a ‘weightless economy’ as our future, as opposed to shipping things around and all the carbon miles that entails, we have to be able to deliver something to the world, and it’s all about applications,” said Riley.

Certainly it was hoped that Telecom’s apparently magnanimous offer to expedite its NGN would not give the government any reason to duck out from under industry pressure to lift its own game, create clear incentives for further private sector investment and lead the way with its own billion-dollar broadband booster.

Footnotes

[1] Giving Effect to the Five Themes, Cabinet Policy Committee, 21 August 2006

[2] http://www.soumu.go.jp/joho_tsusin/eng/presentation/pdf/0705221.pdf

[3] http://www.nyquistcapital.com/2007/08/28/the-proving-ground-of-ntt

[4] http://www.dynamicitkorea.org/koreait_policy/koreait_policy_4444.jsp

[5] Keith Newman, ‘The Next Merciless Leap,’ MIS Magazine, September 2001

[6] Interview with Neil James, 2001

[7] Keith Newman, ‘Educating for Efficiency, Re-balancing the IT infrastructure,’ MIS Magazine, July 2002

[8] Keith Newman, ‘Superhighway to close divide,’ Telecommunications Review, August 2003

[9] Heather Wright, ‘Funding boost for Internet2,’ Dominion Post, 22 September 2003

[10] ‘Demand Case for a New Zealand Advanced Network: Research Education and Innovation,’ released 14 October 2003

[11] Keith Newman, ‘NGI head takes parting shot at slow science,’ Telecommunications Review, June 2004

[12] Juha Saarinen, ‘TelstraClear wins $43m REANNZ advanced network contract,’ Computerworld, 26 April 2006

[13] Stephen Bell, ‘High-maintenance Karen will be costly to run,’ Computerworld, 12 September 2006

[14] Peter Komisarczuk, ‘An ICT CRI?,’ New Zealand Computer Society (NZCS), 22 June 2007

[15] Paul Clearwater, ‘KAREN proves its worth,’ The Line, 13 March 2007

[16] Computerworld staff, ‘The National Library joins KAREN,’ 22 May 2007

[17] ‘A Roadmap for Activities 2007–2009’, Discussion Document: http://www.karen.net.nz/assets/Uploads/Documents/Roadmap-consultation-doc-v3-4.pdf

[18] Research and Education Advanced Network New Zealand, Statement of Intent 2007–2010, May 2007

[19] Ulrika Hedquist, ‘Conferencing on steroids connects NZ universities,’ Computerworld, 25 July 2007

[20] File transfer protocol

[21] Ulrika Hedquist, ‘Service-oriented science is the future, says expat,’ Computerworld, 18 July 2007

[22] Keith Newman, ‘Strategy shifts sought,’ e.nz magazine, September 2007

[23] Teleconsultants Apropos Newsletter, July 2007

[24] Randal Jackson, ‘FX Networks in $40 million network push,’ Computerworld, 5 June 2007

[25] Matt Freeman, ‘Jim Tocher manages massive fibre network and core systems modernisation at Transpower,’ Telecommunications Review, 10 September 2007

[26] Ian Steward, ‘NZ Internet speed “a baby’s crawl”,’ The Press, 13 June 2007

[27] Ernie Newman, chief executive, TUANZ, Address to Telecommunications Summit, Auckland, 25 June 2007

[28] http://www.nemertes.com/ii

[29] Grant Gross, ‘Study: Internet could run out of capacity in two years,’ Computerworld, 21 November 2007

[30] Grant Gross, ‘US National Science Foundation floats next-generation Internet,’ Computerworld, 19 September 2005

[31] http://www.cleanslate.stanford.edu/CleanSlateWhitepaperV2.pdf (URL was active at time of research)

[32] Alison Hardie, ‘Why the search is on to find alternative to Internet,’ The Scotsman, 17 April 2007, http://www.news.scotsman.com/scitech.cfm?id=586392007

[33] Jon Hoyle, ‘No fast track to riches, Freeth warns,’ Dominion Post, 13 August 2007

[34] Helen Twose, ‘Plan takes fibre to NZ streets,’ NZ Herald, 27 October 2007

[35] Telecom press release ‘Telecom commits to tough undertakings on operational separation,’ 26 October 2007

[36] Helen Twose, ‘Huge fines hanging over Telecom,’ NZ Herald, 27 October 2007

[37] Tom Pullar-Strecker, ‘Unbundle, then invest – Tuanz,’ Dominion Post, 5 February 2007

[38] Tim Hunter, ‘Split Telecom twice as nice for the market,’ Sunday Star Times, 18 March 2007

[39] Tim Hunter, ‘Telecom strikes back,’ Sunday Star Times, 29 April 2007

[40] Helen Twose, ‘Huge fines hanging over Telecom’

[41] Paul Clearwater, ‘Telecom delivers more detail on cabinetisation,’ The Line, 23 November 2007

[42] InternetNZ submission: http://www.internetnz.net.nz/issues/submissions/2007

[43] Matt Freeman, ‘Industry bodies cry foul over Telecom’s operational separation plan,’ The Line, 26 November 2007

[44] InternetNZ media release: ‘Telecom draft separation plan disappoints,’ 23 November 2007

[45] David Cunliffe, media statement: ‘Challenge laid down for New Zealand’s digital future,’ 28 November 2007

[46] Paul Clearwater, ‘Cunliffe warns Telecom there is more government intervention to come,’ The Line, 28 November 2007

[47] Ulrika Hedquist, ‘Summit: Let’s get positive about broadband, says panel,’ Computerworld, 3 December 2007