Telecom holds back the tide

Inevitably the world will be linked by broadband networks and if New Zealand gets a leading edge this could help us move from the traditional primary production sector into value added industries capable of matching international niche market demands. Computing and communications technologies will generate an entirely new way of life where distance and international markets will no longer be an impediment to product development. Dr Ian Forrester, New Zealand’s chief scientist, January 1992.[1]

In 1989 the statutory monopoly status enjoyed by the old Post Office was completely removed, allowing anyone to compete with Telecom. There was a change of management, an extensive round of restructuring and cost-cutting with hundreds of jobs axed, and the replacement of outdated systems.

With the sale of Telecom now clearly on the agenda, and a major shake-up occurring right through the public sector, nothing seemed certain any more. The government was making a major investment in Telecom that included installing high-capacity fibre-optic cables with up to 72 fibres around the CBDs of the main cities, and establishing its first mobile network, which had 2300 cellular ‘brick phone’ subscribers by the end of the year.

Telecom announced with great fanfare a new service for data connections called integrated services digital network (ISDN), which could integrate voice and data services over 64kbit/sec data streams. What did it mean? Well, for many years afterwards some in the industry insisted: ‘I still don’t know.’

In 1990 Telecom sacked a further 10,000 people and announced a $300 million profit. Meanwhile it was inundated with customer complaints about double billing from its locally developed integrated customer management system (ICMS), a billing system now costing $73 million – triple the original budget. The monopoly carrier still found $2 million to spend on a PR campaign to try to ease public concerns. It also made sure to give the impression that cutting-edge solutions would soon offer businesses and consumers everything they’d dreamed of. The new dream technology, it said, would be better than anyone could imagine – not only would two channels of 64kbit/sec ISDN be rolled out, but ‘broadband’ ISDN (30 x 64kbit/sec channels) should be available to all New Zealand homes and businesses by 1995.[2]

On 14 June 1990 the government signed an agreement with Bell Atlantic, Ameritech, Fay Richwhite, and Freightways to purchase Telecom, and then sell down 40.1 percent of the shares over three years, with the goal of maximising the level of New Zealand shareholding.[3] Telecom was sold for $NZ4.25 billion, the biggest business deal in New Zealand’s history, and the sixth-largest deal in the world that year. Part of the arrangement was that the new owners would abide by the ‘Kiwi Share,’ an undertaking that free local calling would continue for residential customers; rental for a phone line wouldn’t rise faster than the cost of living unless Telecom’s profits were unreasonably affected; and phone-line rentals for residents in rural areas would not be higher than in the cities.

The government insisted the sale of Telecom would help pay off growing public debt and ensure consumers a better deal. The year before the market was fully deregulated, Telecom chief executive Dr Peter Troughton, in a letter dated 8 June 1988 to SOE Minister Richard Prebble, described a transparent company structure comprising six regional companies and a separate long-distance subsidiary. Within this structure, potential competitors would be able to make “realistic judgments” in which “charges for interconnection will be based on costs” and “the details of specific charges will be available to the Commerce Commission if required.”[4]

The reality leading up to and after the sale was very different. Public debt and consumer frustration continued to rise. There were now more complaints about Telecom than any other business. In fact in 1990 a special unit within the Commerce Commission was working overtime dealing with complaints about Telecom. Among the complainants were six businesses battling the obstacles placed in their path as they attempted to compete in the newly deregulated marketplace. They brought action under Section 36 of the Commerce Act, which covered anti-competitive activity, and the legal battles continued for years.

Picking up the pieces

When the National Party came to power in October 1990 under Jim Bolger, it faced technical, legal, and competitive chaos in the aftermath of the Telecom sale. Minister for Information Technology Maurice Williamson was often criticised as the ‘hands-off’ minister who refused to take the approach many thought necessary to ensure a fair deal for Telecom’s competitors in the emerging market.

“The thing that no one really understood about the Telecom sale was how bad the privatisation had been in terms of stuffing up what needed to be done.[5] During the course of 1990 the big debate was raging in the Labour Government with Prebble and Lange and others about whether they should privatise the whole shooting box or just the competitive bits of Telecom like long-distance calling, overseas calling, data networks and cellular and keep the local loop as a SOE. Helen Clark was deputy prime minister and Michael Cullen was the associate minister of finance, so it’s not like it was different people,” recalled Williamson.

“However the argument Prebble kept putting up was, ‘If you do that, you’ll either get low bids or no bids; we need the money.’ Treasury thought the company was worth $2.6 billion, and Prebble said if we wanted to get that, or possibly even more, we had to sell it unencumbered by all sorts of rules and regulations. He actually went to Cabinet trying to fight for the sale to have no rules and regulations, not even the Kiwi Share. In the end, the conditions in the sale document – universal service, no more than CPI increases, free local calling – were mainly driven by Lange and others who had a phobia about foreigners owning the phone company and having total control.”

Prebble later said he insisted on the Kiwi Share to give the necessary protection, but he wasn’t in favour of it to start with, although the government was happy to accept the $4.25 billion cheque. Williamson said a number of the big finance houses (including Bloomberg and Merrill Lynch) believed the new buyers had overpaid dramatically for the company. “They knew that if they put in the rules or regulations and said you must open yourself up, and there’s an interconnection regime about who else can use the local loop and so on, they’d get nothing compared to what they did.”

Only a few months after the deal was done Labour was out and National was in, with Williamson as IT minister. “I’m sitting there thinking, I’m a huge bloody respecter of property rights, and if somebody has gone and paid $4.25 billion for a property right without any rules or regulations, it’s a bit rich for the people who then sold it to say, once the cheque was banked and cleared, there are quite a few rules we’re going to put around what you can and can’t do.” In Williamson’s own words, he was ‘stuffed’ even though in his personal view Labour should never have privatised the local loop part of the business. “It’s a natural monopoly, it will never be anything but a natural monopoly, and we’ve spent at least 18 years trying to argue about how you get that natural monopoly back.”

One way would have been to buy back the local loop, perhaps returning the purchasers’ $1.5 billion. “That would have been a perfectly sensible thing to do but I can tell you now Ruth Richardson would have looked at me and said: ‘I don’t know what you’ve been smoking, Mr Williamson, but it must be bloody good stuff because we’re not buying back public utilities. On the back of everything else we’ve got a deficit on our hands of ginormous proportions.’” Indeed, during the Treasury briefings of 1990 it was clear the country had a multi-billion dollar debt which was only going to escalate given the way the previous government had been spending. “We had gone out and promised in the campaign that we were going to put out 1000 extra police on the beat, get rid of student fees and that the surtax would go. If you put all those on, the cumulative debt that we would have encountered in our first term in government would have been over $12 billion.”

In 1991, as arranged, Bell Atlantic and Ameritech took up their 98 percent shareholding in Telecom, which was promptly listed on the New Zealand, Australian, and New York stock exchanges, the new owners offering 724.5 million ordinary shares at NZ$2 each. While moving rapidly away from the legacy of the tired old Post Office network, the more commercially focused Telecom raised eyebrows with a price hike. From November its Megaplan (2Mbit/sec premium ISDN service) tariffs rose 114 percent, from $28,000 to $58,000 a month. Regardless, it got the contract to connect PBXs between the United Building Society offices in Christchurch and Auckland with a $500,000 Megaplan voice and data link, aggregating 30 ISDN channels.[6] TUANZ and ITANZ (Information Technology Association) members were aghast at the cost and slammed Telecom for its second huge increase. In August a call was made for the Commerce Commission to investigate the monopoly carrier for ‘making excessive profit and using anti-competitive measures.’[7]

Meanwhile in 1989 the Alternate Telecommunications Company, a consortium involving the local Todd Corporation and the state-owned Railways Corporation, in partnership with US telecommunications company MCI, had been granted network-provider status. It planned to create a nationwide network using the fibre-optic cabling that straddled the main railway lines. In its opening gambit it installed two 512kbit/sec circuits to Hamilton, terminated in state-of-the-art Cisco 4500 routers. It immediately began battling Telecom over an interconnection agreement. An ‘in principle’ agreement was reached by August 1989 but six months later final details were still being debated. An ownership change, which left the company equally shared by TVNZ, Todd Corporation, MCI, and British Telecom, resulted in a name change to Clear Communications, which finally began offering a full tolls service on 1 April 1991, exactly two years to the day since deregulation. A number of issues including rural connection remained outstanding. Another entrant that looked as if it had potential to make a difference in the market was the Kiwi Cable Company on the Kapiti Coast, which announced on 1 June 1989 that it had registered as a network operator.

Days later, in a grand preview of what might evolve in this most deregulated of environments, New Zealand broadcasting and telecommunications experts engaged in an interactive satellite-based videoconference on 6 June 1989 between Victoria University and Ohio State University to discuss the opportunities the new regime offered. The Wellington students had been exchanging emails with their US counterparts using the Bitnet service and voice connections. They were simulating what life might be like if widespread broadband ISDN were available in an effort to help students come to terms with the ‘overall electronic communications environment which would be at the heart of university education in the future.’[8]

International options were opening up as well. In September an agreement to connect the 124,000km, 560Mbit/sec PacRim loop into Hawaii with the Tasman 2 cable between New Zealand and Australia was signed, giving even greater credibility to the hi-tech dreams the optimists were enthusing about. In its Vision 2000 plan Telecom boldly declared it intended to deliver full optical switching, wireless broadband, ISDN, metropolitan networks running at hundreds of megabits per second, and on-demand integrated broadband and mobile services. After four years of talk it finally announced a basic rate ISDN (2 x 64kbit/sec) service, which, apart from the high subscription and user costs, had additional fees now that local business call charging had been introduced.

With deregulation supposedly creating a more open environment than anywhere else in the world, talk turned in earnest to maximising the opportunities, particularly in telecommunications, where the terms Internet and World Wide Web were shifting from arcane usage to mainstream conversation. Since 1988 the Internet had doubled in size every 6 to 15 months. The ever-expanding, decentralised network of networks now spanned 50 countries. A Moscow physics laboratory and a computer programmers’ co-operative became Russia’s first connections late in 1992 and Ecuador and Turkey were connected in early 1993. By mid-1993 the Internet comprised 11,000 connected networks being accessed by an estimated 10 million people.[9]

The term ‘information superhighway,’ coined by US vice president Al Gore in 1991, evoked hi-tech alternatives to the railroads and highways that criss-crossed the globe during the industrial revolution. Now the pace of technology and the speed of roll-out of new overland, undersea, and satellite links were moving the world inexorably into the dawn of the information revolution.

The market was changing fast. Pay TV company Sky Television was now broadcasting on the UHF band and Canwest, owner of TV3, launched its second free-to-air channel, TV4. Aware of the burgeoning on-line market in other parts of the world, Fujitsu entered into a partnership with CompuServe, with a plan to set up two large databases in Australia and New Zealand. It had 700 users across both countries and believed it would have 100,000 local users within five years. There were whispers of a second mobile carrier entering the market to compete with Telecom’s 025 network, which now had around 58,000 subscribers.

Many alternatives were appearing on the data distribution dial: now you could fax it, microwave it between buildings, beam it from a satellite dish, packet switch it, EDI it to an electronic mailbox, broadcast it on spare TV or FM signals, send it via modem down the telephone wires, PABX it down a leased line, let the bureau update your systems overnight, or simply post it using NZ Post. A ‘work in progress’ sign was up over the broadband infrastructure and while ISDN was being talked up at every turn, true competition in the data market was still an urban myth, and there was no sign of the government or Telecom taking the Internet seriously. Talk of the imminent ‘convergence’ of computing, broadcasting, and telecommunications only added to frustrations.

Still looking for backbone

From the early 1990s the development of an Internet backbone in New Zealand was heavily dependent on the willingness of the universities and research organisations to sew their patchwork developments into a cohesive whole. There were still many different types of incompatible networks in operation. With Waikato University now operating as the central gateway for the Internet in New Zealand, and universities across the country eager to share links to the international academic and research community, it was time to formalise an administrative body.

A preliminary meeting of heads of university computer centres and computer science departments was held at the Rankin Brown building at Victoria University early in 1990. The big issue was who would lead the way forward. Neil James, who chaired the meeting, asked: “How would we formalise what had previously been informal, instead of the Computer Science Department at Victoria University being the centre of connectivity? We needed things set up on a more professional basis.”

They agreed to establish Kawaihiko – loosely ‘the shoot of knowledge or enlightenment’[10] in the Maori language – under the auspices of the vice chancellors’ Standing Committee on IT. Its goal was to “grow a vision on behalf of New Zealand,” and it was agreed everyone should aim to move to the common TCP/IP protocols as soon as possible to ensure traffic flowed seamlessly. The Kawaihiko network began operation in April 1990 and the remaining universities, which had been struggling for various reasons to get on board, agreed to replace their ‘number 8 wire’ PC-router links with 9.6kbit/sec leased lines to the network, which were in place by 14 June.[11]

When New Zealand’s first international Internet link was established in 1989, the costs were divided equally among six universities. As the costs of the international bandwidth consumed by different organisations could be highly disproportionate to the relative capacity of their links to the gateway site, volume charging for international bandwidth based on traffic to and from each university was adopted late in 1990.

To achieve some ‘predictability,’ charges were based on bands of traffic use. A site using between 200Mb (megabytes) and 300Mb of international traffic a month would pay a set fee so long as their traffic remained in that band. If their consumption exceeded 300Mb for a single month only, no additional charge would be made. If it exceeded 300Mb for a second consecutive month, they would be charged on a higher use band. This system was subsequently revised, but each university, and later each of the three distinct research and education networks, paid a share of the cost of international bandwidth closely related to its use of that bandwidth.[12]
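In practical terms the banded scheme worked like a simple escalation rule, which can be sketched in a few lines of code. The sketch below is purely illustrative: the 200–300Mb band and the ‘two consecutive months over’ rule come from the scheme described above, but the other band boundaries and the dollar figures are invented for the example.

```python
# A minimal sketch of the banded volume-charging rule described above.
# Only the 200-300Mb band and the "two consecutive months over" escalation
# come from the text; the other limits and the fees are invented.

BANDS = [          # (upper limit of band in Mb, set monthly fee in NZ$ - illustrative)
    (200, 400),
    (300, 600),
    (500, 900),
]

def band_for(usage_mb):
    """Return the index of the band a month's international traffic falls into."""
    for i, (limit, _fee) in enumerate(BANDS):
        if usage_mb <= limit:
            return i
    return len(BANDS) - 1          # anything above the last limit sits in the top band

def monthly_charge(current_band, usage_history):
    """Return (possibly escalated band, fee) for the latest month.

    One month above the current band is tolerated; two consecutive months
    above it move the site onto the higher-use band.
    """
    latest = usage_history[-1]
    previous = usage_history[-2] if len(usage_history) > 1 else 0
    if band_for(latest) > current_band and band_for(previous) > current_band:
        current_band = band_for(latest)
    return current_band, BANDS[current_band][1]

# A site in the 200-300Mb band spikes once, then twice: only the second
# consecutive overrun moves it up a band.
band = 1
for history in ([250], [250, 320], [250, 320, 340]):
    band, fee = monthly_charge(band, history)
    print(history[-1], "Mb ->", "band", band, "fee", fee)
```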

There was a glimmer of hope that the government might step in and provide funding for the science and research network around the time the National Library joined up with Kawaihiko. Gateway manager John Houlker said interest in helping relieve the pressure on maintaining the research backbone came when Dr Ian Forrester, the first chief scientist of the newly formed Ministry of Research, Science and Technology (MoRST), accompanied him to Hawaii for a PACCOM meeting in 1991.

“He got the idea very quickly. He guided us to funding for collaborative research projects within the research sector.” After Forrester left, the understanding within the Ministry of Science about the Internet vanished, according to Houlker, until Phillip Lindsay, the chief information officer at MAFnet, pointed out an opportunity for funding and prepared a bid on behalf of the research and academic community.

False positives

A directive from Cabinet in 1991, ahead of a major restructuring in the academic and research sector, stated that funding would be provided for science and research organisations, universities, interested government departments, and the National Library to form a national computer network. Phillip Lindsay used the opportunity to get all the parties to try to form a national research network.

A large gathering of heads of departments and interested parties was called to expound on the merits of the idea and a steering group was formed to develop a proposal for Cabinet. “We worked out how it could be developed but then there was this big dust-up. The DSIR had developed its own networking equipment, which it wanted everyone to use, but MAF and the universities preferred to use Cisco, the up-and-coming networking company. The DSIR people made it really difficult. In fact a mediator was even called in to try and get an agreement between the parties,” explained Lindsay.

A week before the deal was to be signed the DSIR made a separate bid to the Ministry of Science, saying it had a better way of doing things; rather than a joint project with the universities, it should be running the Internet. The lack of agreement undermined the proposal and while some funding was made available to the universities for basic equipment and DDI links, it was nowhere near the amount in the original application. DSIRnet got funding for new networks and the national network ended up with a combination of different routers. “We were grateful for any crumbs we could get in the end,” said Lindsay. The failure to agree on a strategy to attract the promised funding further exacerbated existing tensions between the universities and the DSIR.

According to Network Wizards, there were 1193 Internet hosts in New Zealand in 1991, out of 535,000 connected worldwide. In anticipation of increased use of the international gateway the satellite link between Hawaii and Waikato was upgraded from 9.6kbit/sec to 14.4kbit/sec. Then in July 1991 traffic growth between Victoria and Waikato universities, the two busiest nodes on the network, resulted in those leased line links being upgraded to 48kbit/sec. In February 1992 the satellite link to Hawaii was replaced with a 64kbit/sec link to NASA’s Ames Research Centre in California.

Initially there were bottlenecks everywhere in the universities’ Kawaihiko network. Every effort was made to cache or archive anything from offshore that the local community might be interested in. Newsgroups and whole FTP archives were downloaded when capacity allowed and stored on a succession of 500Mb disks for others to access.

Victoria University had a close relationship with the DSIR, but sharing DSIRnet capacity to get across the country to the Waikato gateway wasn’t always efficient, Mike Newbery said. “They didn’t have a lot of capacity available during the day and I recall one time having to wait until 2am to download a 1Mb patch for the TRW terminal server. We had asked Telecom to run the network for us but they simply weren’t interested in what we were doing unless they could sell their existing products. I had a number of conversations where I would ask for a certain amount of bandwidth and they would offer a tenth of the bandwidth I wanted at 100 times the price I wanted to pay, then offer a whole lot of added value services we didn’t want at any price.”

Newbery’s requests often met with incomprehension from both carriers. He recalled after one Kawaihiko meeting, when it was obvious something needed to be done to provide much higher capacity for the network: “I had just returned from a conference in Scandinavia where the government supplied the academic network Nordunet with dark fibre over which you could run OC48s (2.5Gbit/sec). Back in New Zealand I had another conversation with Telecom looking to improve our bandwidth. I said OC48 would be nice but I’ll settle for E3 (34Mbit/sec). Telecom said ‘What’s an E3?’ So I patiently explained it to them. They said ‘we can do anything if you pay for it’ and began to talk about vastly inflated prices. I had the same conversation with Clear, which at least had the good grace to look stunned and explained they couldn’t do it because they didn’t have the facilities.”

Commercial break

Andy Linton’s core role in the Victoria University Computer Science Department was still cutting code and acting as postmaster to ensure email got through and the right addresses, protocols, and domains were being acquired and used correctly. However the debt his department had run up continued to be a major pothole in the highway Victoria was developing.

In effect it had already become a de facto ISP in that it was allowing other parties outside the university to connect to the world via its servers and email gateways. The department was in a difficult situation. Telecom wasn’t offering it any special deals for use of its data circuits, and there were no acceptable use policies stopping it from moving traffic around the country, so it felt quite justified in passing on some of the costs. In fact the payment it received allowed the department to ‘unofficially’ trade its way out of debt and start making a handsome profit.

“We were soon at a point where there was enough revenue coming into Computer Science to not only pay the outstanding phone bill but pay for another staff member. I’m sure we were in contravention of all our acceptable use policies, which stated there was to be no commercial activity,” said Linton. Soon the little enterprise began expanding its influence to the broader business community and sharing its knowledge with the industry. Linton and others from the department attended Uniforum conferences to talk about how they had achieved their connectivity. “There was a bit of evangelism going on, if you like. Another piece of the puzzle was that it gave us the chance to talk about the Web stuff that was coming up.”

According to head of department Professor John Hine, no-one quite knew what to do with a department that had gone from a debt situation to having a revenue stream of a quarter of a million dollars. “There was no research funding available here in those days. Other universities in the US and Britain got research funding for this kind of thing and didn’t have to charge, but we had to begin charging per kilobyte: 25 cents per 2000 characters, or an A4 page, which would go overseas in three to four hours in the dial-up cycle. Postage was about $1 and a fax call would have been around $5, so it was still very economical.”

A growing array of customers would dial in to the banks of modems in the Computer Science Department and at one stage even Auckland University began to experiment and handed its modems over to Victoria to run. The department also began a feed to pioneering ISP Actrix, run by John Vorstermans and Paul Gillingwater out of a garage somewhere in Wellington. They’d been paying Telecom “an arm and a leg” and Victoria offered to help them by subletting them a room. “They moved to the university on condition that they would offer sensible rates for students who wanted to dial up the campus network. That meant we were being paid for renting a room and they were no longer paying for links across the city,” said Linton.

Another major obstacle to an industrial-strength academic and research network came when the Crown Research Institutes Act 1992 passed into law, splitting the DSIR into nine Crown Research Institutes (CRIs). While they remained in government ownership, and would continue to undertake research for the benefit of New Zealand, they were required to operate as profit-making enterprises. This was in direct contrast to the DSIR’s previous functions: “To initiate, plan and implement research calculated to promote the national interest of New Zealand” and “to collect and disseminate scientific and technological information, including the publication of scientific reports and journals.”[13]

The Research Division of MAF was similarly charged to undertake scientific research and extension activities in agriculture, as set out in the MAF Act (1953). While there was little change in their roles, everything now had to be done with the shareholders’ interests in mind under the Companies Act. Even though many disputed whether science and research and development could be undertaken on a profit-alone basis, it was simply assumed that the only alternative to the public-service model was the commercial model.[14]

The DSIR and MAF were metamorphosing and the various CRIs that previously had a common voice were now forced to compete with each other. The difficulty of getting agreement on a common network probably delayed the formation of the new Tuianet (‘sew together’ or ‘bind together’) management group by about six months. Tuianet was administered by the Tuia Society, which had the mandate for connecting New Zealand to the Internet and the internetworking of universities, the National Library, the CRIs and the Ministry of Research, Science and Technology. Its first links went live in June 1992.[15]

According to Neil James, who continued to chair the meetings, the turmoil around the changes within the new CRIs made a common aim difficult. “The people brought in to head up the CRIs were running separate businesses and looking at how they could sustain this in competition with each other.” With hope of further government funding now gone, it dawned on the universities that everything would have to be done within existing budgets.

However, the CRIs still had to network with each other, and access to the international gateway at Waikato was an imperative. The old communications-and-technology issues remained despite political changes. John Houlker was asked to join a project set up by the Department of the Prime Minister and Treasury to look at how a common Internet backbone could be managed and where the CRIs would fit in. Houlker recommended frame relay as the underlying ‘layer 2’ (switching, security, and quality of service) network, which could be managed by a telecommunications carrier, while still allowing Tuianet partners to make their own decisions about Internet routing on the top layer.

“There had been a difference of opinion about the Internet architecture we should run and what kit we would use. The DSIR wanted to use the routers they were making and we wanted to use commercial routers being made by Cisco. My concerns were partly tied back to problems I had with connecting DSIR routers in the past which had cost me several months in getting the backbone network operating.”

Ideally frame relay would make it much simpler to set up permanent virtual circuits so Tuianet members who wanted to connect could do so at reasonable speeds without any fuss. Houlker had been negotiating with Netway Communications, a joint venture between Telecom and Freightways, and after learning Telecom was planning to launch frame relay urged them to bring their plans forward six months for a pilot.

Tuia continued to work behind the scenes to manage domains and cover all the administration issues relating to ensuring nationwide coverage and, as demand grew, to increase the speed and availability of bandwidth. IT staff at universities around the country were now operating Cisco routers, switches, and other advanced equipment required to access and share Internet email, FTP and newsgroup services, and typically knew more about this technology than most telco engineers.

In fact Tuianet members had become concerned at the apparent lack of knowledge and even interest in the Internet by Telecom, Clear, and others in the telecommunications market. Few seemed to know anything about IP networks. The issue came up again when the group felt it had to take the lead to get better performance for the Tuianet backbone and the international circuit when discussions shifted from leased lines to frame-relay technology. “We had to virtually push Telecom and Clear into understanding and deploying frame-relay technology,” said Neil James.

The frame-relay equipment from Netway Communications arrived in July 1992. John Houlker recalled that Netway had real trouble managing the frame-relay circuits within the performance specifications customers wanted, as the technology was designed to burst up in speed based on demand. “We benefited from that hugely because the way they had it configured we were paying for a committed rate of about 9.6kbit/sec which was quite modest but it ran at 256kbit/sec most of the time.” Don Stokes said it was intended as a full mesh network but Netway was using Hughes switches. “This seemed a bit stupid because they weren’t easily configurable and didn’t have the kind of matching, mapping and traffic management capabilities of the Stratacom equipment they later used.”

Science as a profit centre

The DSIR network group had also been involved in a collaborative project with Netway Communications to implement frame relay across its routers. It was first demonstrated linking a Telecom office in Wellington to the DSIR office in Gracefield. By June 1992, when DSIR closed its doors, the network consisted of more than 70 routers interconnecting over 20 LANs at DSIR sites from Kaitaia to Scott Base. There were in excess of 50 hosts (VAX, Unix, and Netware servers). The establishment of the CRIs led to the break-up of DSIR’s Physical Sciences Computer Operations Group and its various computer bureau ‘outposts’ were absorbed into the dominant institute at each research campus.

The Network Engineering Group was overlooked in the restructuring, and left without a host CRI. A number of options were explored. The group established a staff-owned company, Advanced Communications Electronics Ltd (ACE Routers), with a view to further developing its router technology and other communications equipment, and operating the communications network. After numerous visits to the various CRI implementation managers, it became clear they were not confident such a small company would have the stability required. They would, however, be happy to purchase services, provided the group was operated from within a CRI. After considerable negotiation, Industrial Research Limited (IRL) agreed to become home to the Network Engineering Group, and on 1 July 1992, the DSIR Network became CRInet.

An important requirement of the CRInet implementation plan was to establish an integrated research network linking the 10 CRIs with Tuianet. After many meetings, technical staff from the universities, MAFTech, and DSIR Network Operations Group agreed on a solution, recalled CRInet’s Dr Peter Whimp. Frame-relay access circuits were installed at each university and at the major research campuses around the country: Mt Albert, Ruakura, Gracefield, Wallaceville, Lincoln, and Invermay. DSIR Palmerston North already had an existing fibre access circuit back to the Massey University computer room.

To foster co-operation between the universities and the CRIs, the network was fully meshed, with committed information rate (CIR) levels chosen to reflect expected traffic flows. Day-to-day operation and management of the network was split: MAFtech retained responsibility for all AgResearch units; the universities took control of their campus connections; and the old DSIR network group, now part of Industrial Research as the CRInet Operations Group, took responsibility for provision of WAN and Internet services for eight of the ten CRIs. Whimp claimed the Tuianet management group was remarkably open, and higher-level management was undertaken by a committee consisting of representatives of all three groups. The initial implementation was fully meshed, with permanent virtual circuits (PVCs) between all sites, but it was acknowledged at the outset that not all would be required. About 12 months into the operation a review of traffic patterns allowed a number of circuits to be dropped, and the ‘savings’ to be funnelled into more heavily used circuits.
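The shape of that arrangement is easy to picture as a small model: a permanent virtual circuit between every pair of sites, each with its own committed information rate, and a periodic review that drops idle circuits and reallocates their CIR. The sketch below is only a rough illustration of that idea; the CIR figures, utilisation numbers and pruning threshold are invented, not the network’s actual settings.

```python
# A rough model of the fully meshed frame-relay layout described above:
# one PVC per pair of sites, each with a committed information rate (CIR).
# All figures are illustrative only.
from itertools import combinations

sites = ["Mt Albert", "Ruakura", "Gracefield", "Wallaceville", "Lincoln", "Invermay"]

# Full mesh: n*(n-1)/2 circuits, each given a nominal starting CIR.
pvcs = {pair: {"cir_kbps": 9.6, "avg_util_kbps": 0.0}
        for pair in combinations(sites, 2)}

def review(pvcs, observed_util, threshold_kbps=1.0):
    """Mimic the ~12-month review: drop barely used circuits and funnel their
    CIR ('savings') into the most heavily used remaining circuit."""
    for pair, util in observed_util.items():
        pvcs[pair]["avg_util_kbps"] = util
    dropped = [p for p, v in pvcs.items() if v["avg_util_kbps"] < threshold_kbps]
    reclaimed = sum(pvcs[p]["cir_kbps"] for p in dropped)
    for p in dropped:
        del pvcs[p]
    if pvcs:
        busiest = max(pvcs, key=lambda p: pvcs[p]["avg_util_kbps"])
        pvcs[busiest]["cir_kbps"] += reclaimed
    return dropped

print(len(pvcs), "PVCs in the initial full mesh")   # 15 circuits for 6 sites
```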

Don Stokes, from Victoria University’s Computer Services Centre, confirmed that apart from the clashes over preferred equipment there were also philosophical differences between the way the DSIR and the universities operated their networks. The universities still operated at arm’s length, using the Internet mainly for access to international services. The CRIs were building, operating, and administering their own internal nationwide network and only some of the traffic was IP-based. “There was never a meeting of minds as to how a national research and education network was going to operate,” claimed Stokes.

He described Tuianet as a bunch of frame-relay nodes talking to each other in what was a watered-down research and education network, but still able to carry internal traffic for the CRIs. It was “a fairly loose confederation” with the concept of “proximal and distal” sites. “A distal site was one that came through one of the Tuianet connected sites which they were then responsible for. For the purposes of charging all the traffic that went through Victoria was charged to Victoria, so we took responsibility for all the networks that were attached to us.”

Phillip Lindsay wasn’t too concerned about the side issues. His group, AgResearch, was just thankful for the incremental performance boost. The geographically dispersed government agency with five major campuses around the country could finally operate as one. Frame relay was more affordable and certainly faster than X.25. “Tuianet was fantastic. It was the first time those groups had come together and we couldn’t have achieved a nationwide network without their co-operation. These were new times and there was new technology everyone was drooling over. It seemed that nothing was impossible. Netway Communications was far more entrepreneurial than its owner, Telecom, and there was a strong commitment from all parties to make this new frame-relay technology work.”

With frame relay, participants knew they had a minimum speed to work with, and if they needed more the network could burst up to accommodate that. “In the early days when Tuianet users were the only ones on the network the performance was fantastic and we were getting far more than we were paying for. As a lot more customers came on the performance declined. We had such good performance at the beginning but as it became more commercial we had to start looking at other options,” said Lindsay.

Tuianet chairman Neil James said it was soon realised the Internet was not going to remain a private research and education network. “It was going to be a lot bigger than that, so we made this deliberate decision to upskill the communications sector, put our leased lines up for tender and try and get three different suppliers involved.” From late 1992 the more robust network, which now connected all seven universities, began operating at 48kbit/sec over Telecom’s frame-relay network, with digital leased line links to smaller sites. The increased flow of traffic forced the doubling of capacity to a 128kbit/sec satellite circuit for the international Internet link to NASA’s Ames Research Centre in California.

Jim Higgins, information systems director with the New Zealand Audit Department, and a long-time member of the New Zealand Computer Society (NZCS), got a call from Neil James at Otago University, who wanted to know if he would chair a group to establish the foundations for a national research and education network, involving all the universities, CRInet and the National Library. Higgins sensed there was something else he needed to know before committing. He called James back asking, ‘Is there something you’re not telling me here?’ and so learned about the conflict, hidden agendas and vested interests, including fears by the newly formed CRIs that they might be overtaken by the new infrastructure.

There were plans to further boost bandwidth but Tuianet’s capabilities, structure and ongoing role were causing some concern. “The idea was that once all of this was put together it would become the backbone of the New Zealand Internet. From the research I did it seemed to me this was the world’s first multi-point ATM (asynchronous transfer mode) network. There were point-to-point networks in the US but no one had run a multi-point at such high speeds as 1.4Mbit/sec,” recalled Higgins.

At the Tuia Society AGM later in 1993 it was agreed that the non-commercial organisation was no longer representative of New Zealand Internet users. Internet governance was at a crossroads; it could no longer be contained within the academic world, and at this stage Tuia was the only body willing to take responsibility.

With Waikato charging Tuianet for international traffic and Tuianet having to pass on costs to its members, it became more important to ensure the accuracy of who was being charged for what. Nevil Brownlee from Auckland University came up with a second-generation packet meter. His NeTraMet traffic meter, designed to help universities and ISPs with ‘traffic accounting’ – monitoring and charging for traffic at per-megabyte rates – was released in October 1993.
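The accounting problem is simple to state: decide which organisation each observed flow belongs to, add up the bytes, and convert them to a per-megabyte charge. The toy sketch below shows only that general idea, not NeTraMet itself; the address prefixes, flow records and rate are invented.

```python
# A toy illustration of per-megabyte 'traffic accounting', in the spirit of
# the problem NeTraMet addressed. Prefixes, flows and the rate are invented.
from collections import defaultdict

RATE_PER_MB = 0.25                    # hypothetical NZ$ per megabyte

SITE_PREFIXES = {                     # which address prefix belongs to which site (invented)
    "130.217.": "Waikato",
    "130.195.": "Victoria",
}

def site_for(ip):
    for prefix, name in SITE_PREFIXES.items():
        if ip.startswith(prefix):
            return name
    return None                       # not a local site (e.g. the overseas end of a flow)

def bill(flows):
    """flows: iterable of (src_ip, dst_ip, byte_count) records seen at the gateway."""
    usage = defaultdict(int)
    for src, dst, nbytes in flows:
        for end in (src, dst):
            site = site_for(end)
            if site:                  # attribute international bytes to the local end
                usage[site] += nbytes
    return {site: round(b / 1_000_000 * RATE_PER_MB, 2) for site, b in usage.items()}

print(bill([("130.217.1.5", "192.0.2.1", 4_000_000),
            ("198.51.100.7", "130.195.2.9", 10_000_000)]))
# -> {'Waikato': 1.0, 'Victoria': 2.5}
```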

Reflecting on that period Neil James said all the Tuianet group wanted to do was act as a catalyst. “I think we succeeded too well in some ways.” One of the reasons for the eventual demise of Tuianet was the carriers ‘playing games with us.’ Representatives of the network began to use the collective buying power of the universities and CRIs to get the best deal possible for leased lines, access to better frame-relay speeds, and other services from competing carriers.

“The last time we went on one of those rounds we screwed them down and chose the supplier we wanted for various services. Then both Clear and Telecom undermined the Tuia deal by going out separately to each university. The next time we met as a group we discovered we had lost our collective bargaining and they’d effectively broken our alliance,” said James. That’s where Tuianet lost control of the development of networking for research and education in New Zealand. “They all went with different suppliers and we rarely met again as a collective.”

There was no point in the group appealing to government for help after the divide-and-conquer attack by the carriers. “The blame can be laid with the New Zealand Government for not understanding the revolution in communications. It was in no mood to be involved in anything that it thought the market might manage. It’s been proven the market doesn’t respond well in the area of data networking; nowhere in the world has a totally laissez faire open market worked properly,” commented James. The hands-off approach, the lack of understanding about where the Internet was going and the refusal to legislate or regulate, he said, meant the government was late to the Internet game.

First-in, first-served Kiwi writes web FAQ

Possibly the first fully populated Web pages put together in New Zealand were created by Victoria University undergraduate Nathan Torkington. He also wrote the first World Wide Web frequently asked questions (FAQ) list to be published on the Internet, and thereafter in numerous publications around the world.

Torkington made his foray into the Internet in 1990 during his first university holidays working at the Computer Services Centre at VUW. He was keen to dive as deeply as he could into the pool of data that was being built up around the Internet and discover the best way to navigate this massive library of information.

“I did all the sort of usual things like Telnetting into MUDs (multiuser dungeon, domain or dimension)[16] and BBSs in the States, which was quite a novelty at the time. I got into Archie and Gopher when they came out and WAIS. I was just being a kid. I really enjoyed the connection to other people on the other side of the world and trying all the new bits of software that were coming out. Playing with that kind of wide area type server information system was intoxicating.”

Torkington was active on Usenet and ran a local FTP archive, mirroring some core international material including the Simtel 20[17] archive and material from Project Gutenberg.[18] He was encouraged by Victoria’s computer services director Frank March, who sought his help planning a campus-wide information system, which Torkington ended up running. At that point everyone knew about Archie and Gopher but Torkington pointed them in another direction. “I thought something better was coming and asked if I could show them this thing called the World Wide Web.” His first Web page was an HTML view of his FTP archives.
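An ‘HTML view’ of an FTP archive is, at its simplest, a script that walks the mirrored directory and writes out a page of links. The fragment below is a minimal modern sketch of that idea only; the paths, hostname and page title are placeholders, and it bears no relation to the original code.

```python
# A minimal sketch of turning an FTP mirror directory into an HTML index page.
# The directory path and base URL are placeholders.
import os
import html

def ftp_index(root, base_url="ftp://ftp.example.ac.nz/pub"):
    lines = ["<html><head><title>FTP archive</title></head><body>",
             "<h1>FTP archive</h1>", "<ul>"]
    for name in sorted(os.listdir(root)):
        size = os.path.getsize(os.path.join(root, name))
        lines.append(f'<li><a href="{base_url}/{html.escape(name)}">'
                     f'{html.escape(name)}</a> ({size} bytes)</li>')
    lines += ["</ul>", "</body></html>"]
    return "\n".join(lines)

# Write the generated page somewhere the web server can serve it, e.g.:
# print(ftp_index("/srv/ftp/pub"))
```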

Torkington is quite comfortable declaring that he created and ran the first real web site in New Zealand. “I remember there was a guy at Canterbury who had downloaded the software and put up a ‘Wahoo, it works’ kind of site, but I actually stuck information on mine and I got right into it. I was part of the WWW Talk mailing list which was where all the software developments happened in those days.” He became a regular on that mailing list, posting questions, digging out details, and discussing essential elements of design with Web pioneers, including the father of the Web, Tim Berners-Lee, and Mosaic creator Marc Andreessen.

“I was there at the start and we were talking about how to make dynamic Web pages, how to do searches and things like that. Back in those days we didn’t even have CGI, which is the world’s simplest way of doing that. We had to add code right into the code of our Web server; there was no handy modularity. So we were talking about the best ways to do that, what forms should look like and how to lay out tables. It was really pioneering stuff.”

Torkington had some input into the unique identifier tag that HTTP uses to determine whether or not a file has changed. His proposals for compressing data to save disk space and having the server uncompress it for transmission over the Web were interesting, particularly in New Zealand where disk space was like gold, but that wasn’t exactly world changing. His most enduring contribution was probably ‘the cancel button’ in the open dialog box of Mosaic.
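The modern descendant of that identifier is the HTTP ETag header: a client keeps the tag the server sent with a file and offers it back on the next request, downloading the body again only if the tag has changed. The sketch below illustrates that exchange in general terms; the URL is a placeholder and nothing here is specific to Torkington’s proposal.

```python
# A small sketch of a conditional HTTP request using an entity tag (ETag).
# The URL is a placeholder; the server must support ETags for this to work.
import urllib.request
import urllib.error

URL = "http://example.org/archive/file.txt"

first = urllib.request.urlopen(URL)
etag = first.headers.get("ETag")          # the server's identifier for this version
cached_body = first.read()

headers = {"If-None-Match": etag} if etag else {}
request = urllib.request.Request(URL, headers=headers)
try:
    second = urllib.request.urlopen(request)
    cached_body = second.read()           # tag changed: take the new content
except urllib.error.HTTPError as err:
    if err.code == 304:
        pass                              # 304 Not Modified: keep the cached copy
    else:
        raise
```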

Of course being first means others want to know how you did it. Torkington was soon sought after to share his Web development skills with other academics. “I still have nightmares about having to teach academics how to make HTML pages. Because the tools were really primitive, you really had to do the work. We weren’t even using Windows 95 at that point.” He pulled together everything he knew and created the first Web FAQ, which was published on 6 April 1993 in the Usenet newsgroups. It was hugely popular and was ultimately reprinted in many books. It was his FAQ, and its subsequent updates, that taught many people what the Web was about.

So how was it that an undergraduate from New Zealand ended up telling the world about the Web? “Well that’s the great thing about the Web isn’t it, nobody knows you’re a dog, and nobody knows you’re an undergraduate from New Zealand. It was just about having the initiative to do it at the right time. It’s still true on the Internet: the first people into a technology become the gurus and it doesn’t matter where they are.”

Red lights and red faces

Just as the Internet was moving beyond the confines of the university and research community, its seamier side, the one represented by alt.sex and related newsgroups and early red-light district web sites, had attracted considerable media coverage and political debate.

There was an attempt to make ISPs responsible for the material being downloaded. Trevor Rogers’ Technology and Crimes Reform Bill, tabled in parliament in 1993 while he was still a National MP, would have resulted in anyone proven to have been transmitting ‘objectionable material’ having their phone line disconnected for up to five years. His private member’s bill stated that any communication with foreign web sites or telecommunication services which transmitted objectionable content would also be considered an offence and ISPs would be ordered to cut those services off. After several attempts to have it passed into law, and an outcry from the Internet community, a select committee eventually rejected the bill in 1997, stating that it was “virtually impossible to control material that is brought in from other countries.” Even if you could, such a law would have Bill of Rights implications.

Interest in the Internet in New Zealand was growing exponentially. The number of users had gone from hundreds to about 10,000 in a few years. To cope with demand, the speed of the NASA-subsidised Internet satellite connection with the US backbone doubled in July for the third time in three years to 256kbit/sec. Rather than the clumsy PACCOM[19] moniker for the access gateway at Waikato, the term NZGate began to be used. In June and again in August, Simon Lyall’s ‘Internet Access in New Zealand FAQ’ was posted on Usenet to teach beginners about Internet etiquette and navigation.

NASA had been scaling back its funding since the subsidy for the international link was first agreed on, and support for the New Zealand link ceased on 30 April 1994. However Waikato had been thinking ahead. Its controversial volume-charging approach had ensured there was sufficient cash in the coffers to cover increases in bandwidth and other contingencies. “It was a balancing act. We were constantly reducing the price through economies of scale but we had to track this so it eventually covered 100 percent of the costs,” said gateway manager John Houlker.

Later in 1994 Waikato’s international link to the West Coast of the United States, supplied by AT&T, had an embarrassing outage. Just as the first wave of Internet aficionados were demonstrating the commercial possibilities of Web connectivity at the Computerworld Expo in Auckland, the connection failed. A technician at the AT&T exchange in San Francisco had accidentally knocked out a plug. “It took them four hours to find the circuit designation and another two hours to fix it,” recalled Houlker. “I had been looking at putting in a second circuit but AT&T insisted theirs was super reliable and would be much cheaper than us investing with another provider.” However, another failure forced Waikato to take a second link into the NASA gateway from Sprint.

Waikato had already taken on far more than it bargained for and was looking to a time when the backbone would be entirely commercially managed. Increased use by businesses and individuals and the arrival of a handful of commercial ISPs suggested it was time to take note of the United States example and rethink who would take responsibility for future development of the New Zealand Internet backbone and administer the dot.nz country code and domain name registry (DNS). A series of meetings took place to make those decisions.

“NASA no longer needed us to keep the Internet in New Zealand going. It became a standard product they could pick and choose from 20 different vendors,” said Houlker.[20] In parallel it was decided that the University of Waikato should step away from running the international link. Frank March, John Houlker, and others proposed an independent democratic organisation to take over the domain name service.

Jim Higgins recalled one meeting of the Tuia Society early in 1994, when John Houlker turned up and announced he had some serious news: the previous Wednesday the number of registrations of dot.co.nz had exceeded the number of registrations of dot.ac.nz. “Waikato had been running an academic network and the thing had now got away from them. He was concerned about what was going to happen and it was then that the Kawaihiko group began seriously considering how to shift everything over to an Internet society. They didn’t want to be running a commercial Internet.”

Frank March drew a graph on the whiteboard which illustrated the point. The dot.co.nz commercial registrations were rising so steeply compared with the other dot.nz registrations that, if current trends continued, they would overtake all the rest by May 1994 and then completely dominate the dot.nz space. “That is exactly what happened and that was the basis for the Internet Society of New Zealand (ISOCNZ) being set up in the first place, not so much to run dot.nz but because it was very clear that the local Internet community was no longer going to be restricted to, or dominated by, the science and academic communities. Something was required other than Tuia so that the rest of the community could be involved in how the Internet in New Zealand developed.”

Don Hollander, at the time chairman of TUANZ, recalled a new body being discussed at a meeting at Victoria University. He believed running the Internet in New Zealand could easily fit in with his organisation’s mandate but that wasn’t to be. In November 1994 the Tuia Society advertised a public meeting at the National Library auditorium in Wellington to establish a new public body to manage Internet infrastructure development.

The CRIs no longer seemed interested in providing funding for Tuianet, leaving it with a much-reduced budget, mostly used for covering air fares, and an amount set aside to help fund a new body to take over its role. Uncertain of the way forward, Tuia had backed away from increasing its membership and from broader responsibilities, concentrating instead on maintaining its core sites until a new organisation could properly represent the interests of users. It was prepared to put up around $10,000 to help establish such a group, which was tentatively referred to as the New Zealand Internet Society.[21]

There was a cost involved in maintaining a DNS, and fees incurred for Internet number assignment, through the Asia-Pacific Network Information Centre (APNIC), which was itself dependent on donations (US$10,000–$20,000). The funding situation could not continue indefinitely. Attempts by Waikato and Tuianet to charge for such activities had “provoked hails of protest from the Internet community.”[22] Running a new body would require resources, including the ability to advise users, although delegating that consultancy role to providers might reduce the load.

It was suggested the Internet Society of New Zealand (ISOCNZ) become a legal entity supported by membership fees. The new group would claim control of the dot.nz namespace. At the meeting, Colin Jackson, from the government’s IT Policy Unit, suggested a clause stating that common resources should be ‘uncapturable,’ a term that became a byword for all the group was to stand for.

The goals of ISOCNZ would be to “maintain and extend the availability of the Internet in New Zealand and its associated technologies and applications, both as an end in itself, and as means of enabling organisations, professionals and individuals to more effectively collaborate, co-operate, communicate and innovate in their respective fields of interest.”

It would start as an unincorporated society and become incorporated as soon as possible. A steering group was established and a further public meeting planned at the National Library in Wellington. The question of who should have responsibility for managing the international Internet connection remained. John Houlker had recommended at least two providers so ISPs and businesses could choose who they wanted to connect through.

“There was every reason to believe this technology was leaving the experimental research and development sector and IP was becoming a conventional telecommunications protocol. We knew the private sector would realise this could be good business and that the telcos might want to get involved. We also knew that a large telco with deep pockets could start offering a discounted service and bowl over our cashflow. We were on a fine balance and couldn’t survive any kind of competitive battle at all. Running the international gateway only worked when we were the only game in town, so the objective was to get out cleanly while ensuring the Internet community had a good service going forward,” said Houlker.

In 1995 the NSFnet backbone was replaced by commercial backbones across the United States. NASA continued to manage the federal gateways with a link into the private networks and the next-generation supercomputer networks. The arrangement with the PACCOM community, which Waikato was party to, was expected to come to an end, with traffic moving across from NASA Ames to the proposed ‘MAE West’ gateway. However Mae’s invitation to “come up and see me sometime” was never issued. The gateway wasn’t ready so NASA, despite its public non-profit charter, agreed to continue running the New Zealand connection at full commercial rates.

Naming ISOCNZ

In 1988, after a spell as a Unix evangelist with DEC, Roger Hicks joined Auckland University as a part-time lecturer in information systems, with a few private consulting jobs on the side. He had seen how successfully the universities were using the Internet for email, FTP, and newsgroup access and worked up a ‘high-level’ business proposal which he took to Telecom. “I suggested if they got a lot of people interested in the Internet they could sell a lot more bandwidth. I took this six-page proposal to a couple of people I knew and tried to sell the idea of a joint venture but couldn’t get anyone to buy into it.”[23]

Clear Communications was just establishing itself in competition with Telecom, so he took his proposition along when applying for a job as IT strategist there. He got the job and claimed his proposal was influential in Clear eventually setting up its ClearNet ISP in 1996. At the time Hicks had stepped aside from his role as founding chairman of Uniforum, but his position at Clear saw him step into another voluntary role.

He’d learned of the Tuia Society meeting at the National Library in Wellington and convinced Clear it should have a representative there just in case the Internet began to move into the mainstream. “I swanned around at Clear and got as many people to join as I could. It was like a write-in proxy; anyone who wanted could vote, so I turned up with a big handful of proxy votes and that’s how I got on the committee and put in a placeholder for Clear. Then everyone turned around and said I was just what they were looking for: someone who wasn’t at university or an academic. They selected me as chairman and at that point I couldn’t very well back down now, could I?”

The meeting was an important step in devolving power from the universities into a wider organisational structure that would determine the future of the infrastructure and the way the Internet would be run in New Zealand. Initially a day-long public seminar was planned to bring all interested parties together to formalise this group. In November 1995 the ISOCNZ was officially incorporated.

Ian K. G. Mitchell, the Computer Society’s nominee to ISOCNZ, had worked in the late 1980s and early 1990s with Wellington patent lawyer Peter Dengate Thrush, developing intellectual property database software for his law firm, Baldwin Son and Carey. Mitchell was elected to the ISOCNZ council, and when the issue of domain names emerged he recommended that Dengate Thrush, by then an IP barrister, advise the society on how it should proceed with the complex issues surrounding ownership and management of domain names.

Dengate Thrush was impressed with the proposal for a ‘first come, first served’ policy for allocating domain names. There was pressure, particularly from the trademark community, for ISOCNZ to set up some kind of examination system prior to registration, with rules preventing anyone other than the holder of a trademark or company name from registering it. However, that approach could have meant registrations taking many months and would have involved the society in permanent dispute resolution.
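In principle, a ‘first come, first served’ registry is very simple: it records the first valid applicant for each name and performs no trademark examination at all. The sketch below is purely illustrative and is not ISOCNZ’s or Domainz’s actual software; the names FcfsRegistry, register and holder are hypothetical.

# Illustrative sketch of 'first come, first served' name allocation.
class FcfsRegistry:
    def __init__(self):
        self._records = {}  # domain name -> registrant

    def register(self, name, registrant):
        """Accept the first application for a name; decline later ones."""
        key = name.strip().lower()
        if key in self._records:
            return False        # name already held by an earlier applicant
        self._records[key] = registrant
        return True

    def holder(self, name):
        """Return the current registrant, or None if the name is free."""
        return self._records.get(name.strip().lower())

registry = FcfsRegistry()
registry.register("example.co.nz", "First applicant")    # accepted
registry.register("example.co.nz", "Second applicant")   # declined: already taken

Under such a model the only test is priority in time; any contest over who is entitled to a name is left to the courts, which is consistent with the stance described below.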

He advised against ISOCNZ creating a formal dispute resolution service. He had seen how this operated in other registries around the world, particularly at Network Solutions Inc (NSI), the dot.com registry manager, and concluded it was uneconomic and unnecessary. Instead, he recommended that ISOCNZ simply comply promptly with any court orders obtained by aggrieved parties.

Dengate Thrush also advised forming a limited liability company to run the dot.nz registry business on a commercial basis, to be wholly owned by the not-for-profit society. The creation of the separate company, known as Domainz, would provide a legal and commercial firewall between the councillors and the operation of the registry. He recommended appointing Gavin Adlam, a local solicitor, to act for the society.

The initial intention of ISOCNZ was to join the international Internet Society (ISOC), but not long after its formation there was a phone call from the executive director of the international body complaining about the name the New Zealand body had chosen. As far as he was concerned it wasn’t an Internet society unless it was a member of ISOC. In 1996, when Roger Hicks turned up at an Asia-Pacific telecommunications conference in Hawaii, he found himself walking a fine line between his role as chairman of the newly formed ISOCNZ and his position on the strategic planning team at Clear Communications. He was confronted by Vint Cerf, who was chairman of ISOC and a senior vice president of MCI Communications, one of Clear’s owners. “Our discussion raised a few eyebrows. He told me he wasn’t happy about the name and I said, ‘Well, tough shit. You haven’t got any say over the name we’ve chosen in New Zealand.’”

Picking up the pieces

Within a year of its formation, the management of the dot.nz name space was transferred to ISOCNZ, along with about 2000 domain names already in the dot.nz register. In September 1996 some real challenges arose. One was a major lawsuit against an opportunist who had registered a range of high-profile company brand names. When Sanyo, Cadbury, Xerox, and Fuji began to realise the potential of the Internet and sought to register their domains, they discovered they had been trumped. They hired lawyer Clive Elliott to get those names back from the Domain Name Company Ltd (DNC), and in researching who was responsible for this travesty the brand-new ISOCNZ found itself implicated. Initially it looked as if ISOCNZ would be a main defendant because it had allowed DNC to register cadbury.co.nz, after which DNC attempted to sell Cadbury its own name back for $10,000.

A letter from Clive Elliott and Alex McDonald of lawyers Baldwin Son and Carey to ISOCNZ lawyers Rudd Watts & Stone on 18 September 1996 set out the statement of claim and the application for an interim injunction, although neither ISOCNZ nor Domainz were named as parties in the proceedings. Fortunately the tide had turned somewhat and the big companies, while understanding the precarious position ISOCNZ was in, were keen to work things out. There was still a high risk, though, that the plaintiffs might yet join ISOCNZ as a defendant.

The matter went to court on 27 September before the Honourable Justice Morris, who ordered that the defendants, the Domain Name Company[24] of New Zealand, David John-Paul Cameron Ward as second defendant, and Patrick Clement Robinson as third defendant, be restrained from directly or indirectly using the words Cadbury, Sanyo or Xerox or the related domain names or any similar name which was likely to “dilute the value of the plaintiffs’ trade name or trademark in connection with advertising, operation or maintenance of any Internet site in New Zealand.”

Justice Morris directed the defendants to take affirmative steps to either cancel the domain names or reassign them to the respective plaintiffs, and forthwith remove any link or referral notice whereby Internet users could access the first defendant’s web site by use of the domain names. The court also directed ISOCNZ to transfer the domain names to the respective plaintiffs immediately.

ISOCNZ was pleased with the outcome because its policy of not taking part in legal proceedings had been upheld. The legal community, however, was disappointed there had been no judgement which might have clarified the interaction between domain names and trademarks. That day in court was yet to come. Meanwhile the ISOC had been placing increasing pressure on ISOCNZ, questioning why it hadn’t joined as an affiliate member. The issue was raised again at an INET (International Networking) conference in 1996. “The reason was, in the chapter rules of ISOC it said, ‘You won’t make political moves without the approval of the ISOC board and you won’t take commercial or financial risks.’ Clearly if we were going with the DNS dot.nz we were going to be doing both of those things,” Roger Hicks said.

“We didn’t want to be restricted from making commercial statements about our marketplace or prevented from taking legal and financial risks. ISOC said other groups were doing all of these things already, so it didn’t really matter. ‘Just join in and we’ll all be happy campers.’ However the committee and council in New Zealand were of the view that if they signed something, that it wasn’t right to ignore what they were signing and say it didn’t matter. That’s why we never joined ISOC.”

There were long discussions about fees and the way these were paid to the US-based governing body, which put the local organisation on the back foot. “We had carved our own path from the beginning, even with our domain name rules, which I was responsible for setting up when Waikato University handed over the domain system to us. We’ve always had an independent approach which resulted in a lot of accolades, particularly when we decided we were a listing service and we would allow any name to be listed on a first come, first served basis. Those principles were quite surprising for many people,” said Hicks.

The Internet in New Zealand had a new advocate watching its back, but the territory was still like the wild west, and ISOCNZ had to work hard to establish its claim. The group was not timid or easily warned off by big international bodies, which were themselves still working out how to deal with the inexorable march of the Internet. A strong watch was being kept on legal and regulatory issues, and on the impact of the commercial market as ISPs became an important part of the growing Internet community.

Footnotes

[1] Interview with Keith Newman

[2] It never happened

[3] Speech to TUANZ by SOE Minister Richard Prebble, 7 August 1990

[4] Gordon Campbell, ‘Tough Calls,’ NZ Listener, May 17–23, 1997

[5] Williamson says he has all the media clippings and documents if anyone challenges his take on what was happening

[6] Networked PABXs in high-speed link, Network World, 12-06-89

[7] The Commerce Commission withdrew its long running action against Telecom’s Megaplan pricing in March 1995. It concluded the time and costs involved in pursuing this matter to finality would be greater than any benefits that might result. It was no surprise that seven years after the complaint was made, the telecommunications environment had changed so much that no important precedent was likely to be established. Commerce Commission Press Release 31 March 1995

[8] Keith Newman, Satellite link offers glimpse of future, Network World, May 29, 1989

[9] Frank Bajak, The Press, Tuesday, 1 June 1993

[10] The shoot or branch of a creeping plant or the handles on the kete of knowledge. Hiko = random or distant flashing, lightning or something beginning to shine

[11] VUW router’s change log says the Kawaihiko DDS links were operating from 14 June 1990. Comments and clarifications by Don Stokes on NZnog newsgroups

[12] Donald Neal, ‘The Harvest Object Cache in New Zealand,’ Information & Technology Services, University of Waikato

[13] DSIR Act 1974 (section 5a)

[14] Doug Edmeades MSc (Hons), PhD, DipManag, MNZSFM, New Zealand Science Review Vol 61 (3–4) 2004. http://nzas.rsnz.org/publish/archive/NZSR_61_3_4.pdf

[15] Comments and clarifications by Don Stokes on NZnog newsgroups

[16] A MUD (multiuser dungeon, domain or dimension) is a multi-player computer game that combines elements of role-playing games, hack-and-slash style computer games and social chat rooms. Typically running on an Internet server or bulletin board system, the game is usually text-driven: players read descriptions of rooms, objects, events, other characters, and computer-controlled creatures or non-player characters (NPCs) in a virtual world. Source: http://en.wikipedia.org/wiki/MUD

[17] The Simtel archive was first made available on the public Internet in 1993 after its original host on ARPANET was shut down. It was a colossal archive of shareware for various operating systems, particularly Microsoft Windows and MS-DOS

[18] Project Gutenberg was founded by University of Illinois student Michael Hart in 1971. It is a volunteer effort to digitise, archive, and distribute cultural works. Most of the items in its collection are the full texts of public domain books

[19] The PACCOM consortium partly funded the original connection of Internet sites in New Zealand to the rest of the Internet in the United States, via a gateway located at the University of Waikato

[20] John Houlker interview with Keith Newman, New Zealand Herald, July 1998

[21] From the minutes of the Tuia Society meeting November 1994

[22] John Houlker as quoted in the minutes of the meeting

[23] Hicks insists one of those people was instrumental in setting up Telecom’s Xtra ISP seven years later

[24] In Qantas Airways Limited v The Domain Name Company Limited (2000) 1 NZECC 70-005, the defendant registered the domain name qantas.co.nz. It then attempted to sell the name to Qantas. The High Court was quick to condemn this and ordered the defendant to de-register the name. The Court found that such actions were a deliberate blocking of the lawful exploitation of the name and a fraudulent use of Qantas’s goodwill. Dominion Breweries was also successful in its court action to get the domain name db.co.nz back from the Domain Name Company in DB Breweries Ltd v The Domain Name Company Limited (2000) 1 NZECC 70-009