Intelligent arguments

Over the last few years, satellite system designers and operators have engaged in an increasingly heated dialogue. The topic? On-board processing and switching for broadband satellites.

The debaters break neatly into two camps. In one corner: the “yes, on-board switching is the future” camp; and in the other corner: the “no, it’s a bunch of over-programmed hooey” camp.

Naturally, the two sides have much to say.

The concept of on-board switching is not entirely novel. For 10 years, the U.S. government has employed on-board switching for narrowband applications on its ACTS and Milstar satellite programs. Mind you, these are narrowband programs, not broadband.

On paper and in limited practice, on-board switching appears a mighty challenger to ground-based switching and processing — particularly for broadband satellite networks. Most Ka-band systems will feature first generation on-board switching.

But drop the phrase “on-board switching and processing” in casual conversation among operators and take cover. Passions run high.

If you elect to put the smarts on a satellite and launch it, and then something goes wrong and some critical component takes a dirt nap, what then?

In a traditional bent pipe, transponded satellite network, all the smarts are ground-based.

The complexity of on-board switching is daunting because those components responsible for routing and processing digital data chunks will be far, far above the Earth in a LEO (500-1,000 kilometers) or GEO (36,000 kilometers) orbit, depending on the network architecture. If something fails catastrophically with the on-board components, it will be a long journey, even for the stalwart engineer. To avoid funky Stanley Kubrick-type scenarios that include orbiting engineers with tool boxes, most proposed Ka-band satellite systems have plans for full redundancy, just in case.

On-board processing makes the best of full-mesh connectivity, single-satellite hops, bandwidth on demand and frequency reuse.

But again, bent pipe technology can achieve almost all of the same things.

The goal of on-board processing: exploit resources. Make the best of spot beams and beam switching and do it with a splash of bravado.

The problem, of course, is how fancy do satellites need to be? And if on-board switching is so deliciously splendid, as the proponents claim, why isn’t it accepted as gospel?

It’s not just the distance between an engineer and a payload that’s worrisome. It’s about money. Money. Money. Money.

Why pay Ferrari prices when a Volkswagen will do?

Adding on-board processing is expensive, and the more complex and sophisticated the design, the higher the price. Proponents of on-board switching claim that this higher price is recouped through the higher throughput. The cost per bit of information goes down, and operators can haul more “billable” bits with smarter payloads.

The bent pipers are far less bullish, naturally, and proselytize that the risks of on-board processing far outweigh the advantages. “If it’s not broken, don’t fix it,” is the battle cry.

In the grand scheme, there’s probably room for both designs. So, bent pipers and on-board switchers, put the gloves down. You’re both right.

Just remember the maxim: Don’t let technology exceed application.

On-board processing may be risky. But it may work.

Bottom line: Quit arguing theory. Build the fancy satellites; if operators are willing to ante up, let them. We could use the bandwidth anyway.

Illuminating The Invisible Broadband

Pushing fast access to the edge of the Net with satellite

To Internet service providers (ISPs) and infrastructure players, three consumer and enterprise markets are practically invisible.

Virtually unnoticed are areas outside the major urban markets where high-bandwidth terrestrial lines are impractical or expensive; international markets seeking backbone access; and existing access providers seeking additional bandwidth without major investment timelines or costs.

Illuminating these markets isn’t a matter of adding more ground lines. It’s a matter of using satellite-based systems with fast protocols and smart caching systems that push content and access to the edge of the Internet faster. Satellite-backed networks can be deployed more rapidly than terrestrial lines at a fraction of the cost.

It’s called the invisible broadband.

Connecting earth and sky

Working and living within a well-wired major U.S. market makes it easy to forget about areas that don’t have high-speed Internet access. Many ISPs serving such areas struggle to provide adequate access to customers.

India, a market with more than one billion people and a burgeoning technology sector, will be waiting for years to gain access to high-capacity terrestrial services. Even in the United States, access providers in major markets struggle to balance demand, costs and long construction times for extra capacity.

Building T1, DS3 or larger connections for these markets, however, is not necessarily the answer. The most important factor is often decentralizing content. The goal is to push content as close to the ultimate user as possible — the edge of the Net.

The solution to this conundrum of fast access without terrestrial lines comes with the combined power of next-generation satellite transmission systems and community caching networks.

One of the biggest technical challenges with satellite systems is latency: delays in network response because of the distance satellite-based systems must transmit data, compounded by the milliseconds required to translate, route and compress data. Satellite-based systems using TCP/IP protocols suffer from inherent latency problems because of a combination of physics and the Internet's standardization on TCP/IP. TCP/IP-based satellite systems introduce latency primarily because of inefficient methods of compressing the data chunks, known as packets, that form larger messages or files. Under these conditions, performance plummets with delays that can exceed 700 milliseconds, including round-trip signal transmission and packet delays.
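The physics half of that 700-millisecond figure is easy to sanity-check. A back-of-envelope calculation (illustrative only, using the 36,000 km GEO altitude cited earlier; real links add routing, queuing and processing delay on top of pure propagation):

```python
# Rough propagation delay over a GEO satellite link, assuming a
# 36,000 km orbital altitude and signals at the speed of light.
# These are illustrative figures, not measured link budgets.

SPEED_OF_LIGHT_KM_S = 299_792.458
GEO_ALTITUDE_KM = 36_000

# One hop: up from the sending station to the satellite, back down.
one_hop_s = 2 * GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S

# A TCP exchange needs a full round trip: request up/down, reply up/down.
round_trip_ms = 2 * one_hop_s * 1000

print(f"one-way hop: {one_hop_s * 1000:.0f} ms")
print(f"round trip:  {round_trip_ms:.0f} ms")
```

Pure propagation alone accounts for roughly 480 ms of round trip, which is why packet handling overhead so easily pushes total delay past 700 ms.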

Forward error correction codes add data redundancy to the transmitted data bit stream. The added redundancy allows the receiving system to identify and correct transmission errors. The data redundancy adds spectral bandwidth to the signal but allows reduction of power levels and often transmitter costs. In extreme cases, the transmitted power level can be reduced by as much as 90 percent at the expense of using more than twice the satellite bandwidth required by an uncoded signal. Use of forward error correction dramatically reduces secondary transmission requests and often contributes to lower hardware costs.
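The redundancy-for-power tradeoff can be seen in miniature with the weakest possible code, a rate-1/3 repetition code: each bit is transmitted three times and the receiver takes a majority vote. (Real satellite links use far stronger codes such as convolutional or Reed-Solomon schemes; this toy version is only to make the mechanism concrete.)

```python
# Minimal forward error correction sketch: a rate-1/3 repetition code.
# It uses three times the bandwidth of the raw signal, but the receiver
# can correct isolated bit errors without requesting a retransmission.

def encode(bits):
    # Repeat each bit three times before transmission.
    return [b for b in bits for _ in range(3)]

def decode(coded):
    # Majority vote over each group of three received bits.
    out = []
    for i in range(0, len(coded), 3):
        triple = coded[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)
    return out

message = [1, 0, 1, 1, 0]
coded = encode(message)
coded[4] ^= 1                       # simulate one corrupted bit in transit
assert decode(coded) == message     # the receiver corrects it silently
```

The corrupted bit never triggers a secondary transmission request, which is exactly the effect the column describes, bought at the cost of tripled spectral bandwidth.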

The combination of enhanced protocols, data compression and forward error correction leads to significant improvements in the encapsulation and de-encapsulation process, which determines the manner in which data is transmitted and received. Yet this combination is just part of the satellite-based “invisible” broadband solution, which differs from the “visible” broadband of land lines.

The final frontier

A vital component to building invisible broadband and pushing fast access to the edge of the Net is dynamic, n-tier caching. Even with fast VSAT (Very Small Aperture Terminal)-based satellite transmissions, the greatest efficiency of invisible broadband comes from placing caching systems at each downlink site. Caching systems are a form of high-speed, temporary storage for data that needs to be rapidly available.

The true power of such caching systems comes from the ways they are grouped and regionally managed. Local caching systems connect both to the network’s core parent cache systems and to each other. In a given area, several local caching systems will share and dynamically balance the most popular content to serve a region faster and more efficiently as an n-tier network. An n-tier network is a generic description of the new generation of client-server systems that have many more layers or clusters of networks than the two-tier client-server model. N-tier networks more closely resemble the way the Internet really works, a network of networks, and add greater capacity to distribute information to a wide number of users.

A combination of prioritization protocols and pre-emptive caching algorithms determine storage of popular content such as Web pages or video files. This n-tier, dynamically balanced system forms a community caching network (CCN).

The general premise of VSAT-based caching networks is that it is easier to push data to the perimeter of the Internet than to manage high-volume requests that distort bandwidth requirements. To this end, pre-emptive caching takes advantage of low-usage periods to replicate potentially entire Web sites or networks.

Satellites connected to dynamic caching systems benefit regional groups of ISPs as well as individual ISPs. When a request for a given piece of Web content is made, the reply from the parent cache is multicast to the entire caching community at once. Multicasting ensures that cumbersome redundant requests are reduced.
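The multicast-to-the-community idea can be modeled in a few lines. This is a deliberately simplified in-memory sketch of the mechanism, not any particular vendor's product: one local cache misses, the parent's reply is replicated to every cache in the region, and siblings never repeat the request.

```python
# In-memory model of a community caching network: a parent cache's
# reply to one miss is "multicast" to all local caches in the region,
# so subsequent requests anywhere in the region are served locally.

class ParentCache:
    def __init__(self, origin):
        self.origin = origin      # simulated origin-server content
        self.fetches = 0          # counts expensive backbone fetches

    def fetch(self, url):
        self.fetches += 1
        return self.origin[url]

class CommunityCache:
    def __init__(self, parent, n_local):
        self.parent = parent
        self.locals = [dict() for _ in range(n_local)]

    def get(self, site, url):
        cache = self.locals[site]
        if url not in cache:
            content = self.parent.fetch(url)
            for peer in self.locals:     # multicast the reply to all peers
                peer[url] = content
        return cache[url]

parent = ParentCache({"/news": "<html>headlines</html>"})
region = CommunityCache(parent, n_local=5)
region.get(0, "/news")   # site 0 misses; reply is multicast to all 5
region.get(3, "/news")   # site 3 now hits its own local cache
assert parent.fetches == 1
```

Five sites, one backbone fetch: that ratio is the whole argument for multicasting cache replies.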

For individual users, the architectural upgrades translate to faster access, but commercial-caliber clients benefit as well. For firms concerned about security and dedicated bandwidth, the invisible broadband model forms a virtual private network (VPN), which spares concerns about exposing data and mission-critical applications to the public Internet. These same VPNs can be extended to completely private networks for dedicated applications.

The above scenarios and example are most appropriate for areas outside major urban markets or international ISPs seeking a gateway to the Internet backbone. Yet one of the greatest opportunities for invisible broadband is to augment the capabilities of ISPs in major markets seeking to improve the speed and diversity of services without the major investments required for additional land lines.

Savvy ISPs and content aggregation

In a business where success depends on maximizing efficiency and volume of traffic on razor-thin margins, ISPs in major markets must constantly seek opportunities to cut costs while maintaining excellent service. But frequently, the only option to expand bandwidth is the addition of DS3 or other high-capacity lines, which often take too much time to build and fail to lower operational costs.

The VSAT-based satellite and caching structure provides the optimum solution to this problem by offering a way to build an overlay network with a quick set-up in a matter of weeks and much lower operating costs. Yet, the key to success isn’t just about fast access, but savvy content aggregation. ISPs in major markets can deliver the best of the Net to customers without delays by prioritizing the most popular content.

In addition to the CCN, there are several forms of content aggregation and distribution that enable the multi-tiered invisible broadband network to operate smoothly. Pushing Internet services close to the end users is necessary to reduce non-customer traffic over a satellite hop.

First, DNS (Domain Name Servers)-to-IP conversions are cached locally, which saves time that domain name servers usually need to convert names to numbers. Second, ISP customers can set up their own email structure under POP3 (Post Office Protocol) and SMTP (Simple Mail Transport Protocol) to include virus protection and distribution that put remote servers closer to end users.

Third, content aggregation operates in both directions, bringing the most valuable content to the edge of the Net but also hosting remote applications, such as e-commerce servers, in target markets. This type of Web accelerator is similar to a proxy server, but in reverse, bringing content and applications closer to the targeted audience.

Fourth, security is a very important aggregation service for schools or municipalities. To protect customers from hackers who take advantage of remote networks, satellite-based broadband networks provide additional firewall support.
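The first of these services, local DNS caching, amounts to a TTL-bounded lookup table kept near the end users. A minimal sketch (illustrative only; a production resolver also handles record types, negative caching and upstream failures — the `fake_resolve` backend here is a stand-in for a real DNS query):

```python
import time

# Minimal local DNS cache: repeat lookups are answered from memory
# until the cached record's TTL expires, saving a slow round trip
# to a remote resolver over the satellite hop.

class DnsCache:
    def __init__(self, resolve, ttl_seconds=300):
        self.resolve = resolve        # function: hostname -> IP string
        self.ttl = ttl_seconds
        self.entries = {}             # hostname -> (ip, expiry timestamp)

    def lookup(self, hostname, now=None):
        now = time.time() if now is None else now
        entry = self.entries.get(hostname)
        if entry and entry[1] > now:
            return entry[0]           # fresh cache hit: no network trip
        ip = self.resolve(hostname)   # miss or stale: query upstream
        self.entries[hostname] = (ip, now + self.ttl)
        return ip

queries = []
def fake_resolve(name):
    queries.append(name)
    return "192.0.2.10"               # address from the documentation range

cache = DnsCache(fake_resolve, ttl_seconds=300)
cache.lookup("example.com", now=0)    # miss: goes upstream
cache.lookup("example.com", now=100)  # hit: served locally
assert len(queries) == 1
```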

In concert, these services augment an ISP’s offerings while providing a buffer zone of bandwidth for usage spikes driven by streaming media, voice over IP (VoIP) and other bandwidth-intensive applications. With dynamic load balancing of service and data types, ISPs gain bandwidth control and monitoring.

Over the horizon

As ISPs migrate from a purely terrestrial-based infrastructure model to a hybrid approach with satellites and caching, future services loom as opportunities to expand offerings to customers.

Distribution support for ISPs serving their customers as content aggregators means that multiple POPs can receive content and applications with a single satellite transmission. This reliable and cost-effective management technique ensures that customers always have the fastest access to information.

Replication services that re-distribute information among multiple users will equip providers of bandwidth-intensive applications with more reliable delivery to end-users. By transferring bandwidth-intensive applications to multiple locations, reliability of service increases with minimal concurrent bandwidth requirements.

The Internet access marketplace has tremendous growth opportunities. But the need for cost-effective, fast access requires a new model, both for areas outside major markets and for providers inside major U.S. markets where low-cost bandwidth augmentation is imperative. VSAT-based satellite systems with high-speed protocols and savvy caching systems bring all the elements of success together in a single, time-tested package.

The changing faces of Internet access

The original Internet service provider model was relatively simple: one or two T1 connections to the Internet backbone, a large number of telephone modem banks, and subscribers dialing in on 19.2 to 56 kbps telephone lines for Internet connectivity. While 56 kbps connections are sufficient for basic static Web page access, market demands and service offerings are becoming much more complicated as subscribers learn about broadband and expect to listen to radio broadcasts, use video streaming, buy software or download large files off the Internet.

Now ISPs need to consider what their role is with respect to cable, DSL and satellite. ISPs will not be able to afford a brute force approach of just adding more backbone capacity. Satellite, especially as an overlay distribution media, is a key technology that demands consideration. It provides instant infrastructure with considerable bandwidth and is especially useful for streaming and multicasting content.

On a global basis, the Internet service provisioning business can be generalized into two basic models. In the more mature markets of North America and Europe, ISPs are facing the challenges of consolidation, market demands for service differentiation, and subscriber acquisition and retention. These ISPs have been in business for several years and want not only to remain profitable but also to move forward in market share and revenue growth.

The second model covers the rest of the world (ROW), where ISPs’ foremost concerns are providing basic connectivity and establishing services on the frontiers of Internet access. These ISPs are literally establishing the edge of the net over wide geographical areas of Asia, Latin America, and Africa.

The trend for Internet access across Asia is individuals sharing ISP accounts at their local cyber cafe — typically four to five users per account. As the demand for Internet access spreads across Asia, India, and other infrastructure-poor regions, kiosks and cafes will spread like wildfire.

Some recent market studies show that satellite will overtake DSL in some markets as the access technology of choice. And this seems very plausible if terminal economics change in the near future, as they appear poised to do.

BroadLogic’s chief executive officer, Toby Farrand, recently stated, “We truly believe that satellite will surpass DSL in market penetration for Internet access in certain markets. BroadLogic is going to radically change the economics of the terminal equipment. It used to be that a two-way-capable satellite terminal was $5,000. We fully expect this price to drop below $1,000 in large OEM quantities and probably much less than $1,000. This, along with the new spot beam and Ka-band satellite technologies, is significantly changing the economic model right under our (collective) feet.”

And BroadLogic is a key player in providing receiver technology to Gilat, which is in the midst of a 20,000-subscriber U.S.-based beta trial for its Gilat-to-Home service. It will take a number of months for them to roll out the service and make adjustments. Once MSN, Echostar and Gilat process the lessons learned, it will be available instantly for consumers to purchase through every Radio Shack (Tandy) store, every Dish (Echostar) distribution channel and to existing MSN subscribers. There’s no need to wait for streets to be torn up. National service rollout is expected at the end of 2000. Unofficial pricing places the monthly service at $60 and equipment cost below $400, which is competitive when compared with the $40-$50 monthly fee for cable and DSL access.

As bandwidth-hungry content and new applications become more in demand, the challenges facing ISPs grow — especially those located in the more mature markets. According to a recently released report on Satellite Mediacasting from Pioneering Consulting, bandwidth requirements to support this new demand will rise exponentially over the next five years.

While it may not seem apparent, ISPs in the rest-of-world market (ROW) have the advantage.

The high-tech cool war

Unfortunately for U.S. industry, many of the predictions made in that article have now come true: In a nutshell, U.S. satellite manufacturers have lost orders to Europe and European manufacturers have lost confidence in U.S. suppliers.

The evidence: Last year Hughes was forced to default on a $450 million contract with Asia Pacific Mobile Telecommunications and Space Systems/Loral was obliged to suspend work on Chinasat-8. Meanwhile, the German space company DASA notified its project managers that they should seek non-U.S. suppliers wherever possible. This year, following the Canadian government’s decision to ditch a potential U.S. supplier, Alenia Aerospazio of Italy won a contract to build the platform for the Radarsat-2 spacecraft. Alenia too has reduced its use of U.S. components and could be forced to abandon them entirely.

U.S. government officials have agreed, in principle, to treat some of their allies, such as Canada and some European nations, as special cases. Canada, for example, has been accorded an exemption under the U.S. International Traffic in Arms Regulations (ITAR) accord, presumably because America does not see Canada as a military threat.

Well, it’s very nice of America to consider Europe a special case under ITAR, but it misses the point. Commercial communications satellites and their components are not arms and should never have been included in the export regulations in the first place.

Apart from that, Europe and the rest of the world do not ultimately need America to supply their satellites. Yes, I know America’s leading satellite manufacturer, Hughes, has sold more than any other manufacturer, but “sold” is past tense. Let’s consider the future.

Following the recent formation of Astrium, a corporate amalgam of French, British and German space companies, Europe is home to two major satellite prime contractors (the other being Alcatel Space) and the world’s leading launch provider, Arianespace. In a sense, Europe is now a one-stop shop for satellite systems, a monopoly position once held by the United States.

Europe also has one of the largest satellite operators, Eutelsat, which plans to launch seven new satellites in the next two years as part of a $1 billion upgrade program. It is possibly not surprising that Eutelsat’s three generations of satellites have been built in Europe, but Astrium’s recent contracts for the Intelsat NI-Alpha spacecraft and Inmarsat’s fourth generation show that international organizations are also turning to Europe for their future orbital infrastructure.

A major argument against the transfer of licensing authority to the State Department, and the reason Europe’s manufacturers are picking up the extra satellite contracts, is the time it takes to get an export license. The State Department’s sensitivity to the issue was illustrated in May, when it closed its export licensing office for two weeks while it relocated. No written notice of closure was issued. Presumably no account was taken of any disruption that might be caused. I would guess that the words “to serve” do not feature in its motto.

Another argument against the current bureaucracy is that it hinders launch failure investigations, an issue of crucial importance to two international joint ventures involving American companies – International Launch Services (ILS) and Sea Launch. Both systems have suffered failures, of the Russian Proton and Ukrainian Zenit respectively, in the last year. The constriction of information flow enforced by ITAR rules extended the investigations, reduced the operators’ capability to serve the market and placed their businesses at unnecessary risk.

According to reports, at one point during the failure investigation, foreign nationals stationed at Sea Launch’s Long Beach home port were denied access to their own technology. Ultimately, this type of mentality and the lack of communication that ensues lead to technical failure — and this is one vicious cycle the launch industry cannot afford to enter.

All three of America’s leading satellite manufacturers have been accused by the government of compromising national security by releasing satellite and launch vehicle information to foreign nations, particularly China. This has been an issue for more than a year now, yet no manufacturer has been charged with a crime, let alone convicted. Perhaps the ultimate event occurred in April when, during a review of NASA’s Next Generation Space Telescope (hardly a national security concern one would think), a number of researchers were asked to leave the room because a Lockheed Martin presentation contained information controlled under ITAR. Strangely, one was a U.S. citizen (but employed by a German science institute); another was a French scientist with permanent U.S. residency status. Is U.S. industry running scared or what?

Indeed, the fear of Big Brother has spread to the ivory towers of U.S. universities, such as Stanford and MIT, which now avoid hiring non-U.S. citizens, dreading the consequences of the ITAR regime. It’s lucky this is not the McCarthy era because they’d be rooting out anyone with experience of a European vacation.

America is often accused of being parochial and it does not need to exacerbate this image. Apart from that, I seem to remember that Americans are proud of their rights for freedom of speech and expression? Is this not the cornerstone of a democracy?

Word is, America has an election coming up. One wonders whether industry supporters will be brave enough to make this an issue. Or will they quietly withdraw behind the ITAR curtain?

Satellite’s Golden Egg

A golden goose is waltzing around the heads of the satellite industry and it’s about to lay a golden egg, a Ka-band egg that will ultimately be worth billions of dollars.

It’s no pipe dream.

Neither is it a dream that will become a reality overnight. It’s going to take months and years for satellite broadband to capture the 15-30 percent market share that is being projected.

So what do we do while we await this windfall?

Two things come to mind. And by doing one, we do the other.

The first is to learn as much as we can about providing services directly to the commercial and consumer mass markets. (The telephone and cable companies obviously have a huge head start, but they also carry a lot of heavy baggage.)

The second is to use current satellite technology and broadcast bandwidth right now to offer on a smaller scale the kind of value-added services we’ll be offering when the first generation Ka-band satellites are finally deployed, especially into the commercial market, which will be the early adopter.

A firm foundation

Satellite starts off on a very firm foundation.

VSATs have performed well in the commercial communications niches they occupy. They have a track record.

Satellites are used to deliver signals to terrestrial television transmitters, studios, cable head ends, and from news and sporting events back to collection or editing points. They’ve proven they’re reliable.

Broadcasters are using satellites to deliver entertainment across vast geographical areas (and have done so for many years). In fact, one could argue that satellite broadcasting has become ubiquitous. So it’s not afraid of a little competition, and can effectively penetrate and service a basic mass market.

The digital advantage

Satellite broadcasters were among the first to embrace digitization when the large potential cost savings offered by compression became apparent. Those versed in satellite broadcasting know how to get ahead of the technology curve.

And now the Internet, coupled with IP, DVB and other open global standards, has made it even easier to compress content, and to offer cost-effective solutions for the delivery of data by satellite.

In fact, with the low-cost, high-volume chip sets developed for the mass market, communication at high speed over Ku-band satellites, at an acceptable price point, is already possible. Now what is needed is for broadcasters, ISPs or others with vision to step forward and offer audio, video and data services via satellite to a customer base that is more than willing and able to pay for them.

Getting up and running is surprisingly easy. With an existing digital uplink and a minimum of extra hardware, simply take the data from a data transmission source and format it into an ASI data stream. This can then be injected into a video multiplexer and transmitted over satellite. If the multiplexer is a statistical multiplexer, a few Mbps can easily be opened up for the data services. Where the uplink has an older mux without statistical multiplexing, the bandwidth of several video channels can be reduced slightly to free up the bandwidth for the data, without impacting video quality.
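Under the hood, that injection step means chopping the data into the same fixed 188-byte transport stream packets that carry the video. A naive sketch of the framing (simplified for illustration: real DVB data broadcasting uses multiprotocol encapsulation with its own section framing and PSI tables, and the header flag handling here is not spec-complete):

```python
# Naive sketch of wrapping a data payload into fixed-size MPEG-2
# transport stream packets for injection into a DVB multiplex.
# Header fields are simplified; not a spec-compliant implementation.

TS_PACKET_SIZE = 188
TS_HEADER_SIZE = 4
TS_PAYLOAD_SIZE = TS_PACKET_SIZE - TS_HEADER_SIZE   # 184 bytes per packet

def packetize(data: bytes, pid: int) -> list:
    packets = []
    counter = 0
    for i in range(0, len(data), TS_PAYLOAD_SIZE):
        chunk = data[i:i + TS_PAYLOAD_SIZE]
        chunk = chunk.ljust(TS_PAYLOAD_SIZE, b"\xff")   # pad the last packet
        header = bytes([
            0x47,                      # sync byte, fixed for every TS packet
            (pid >> 8) & 0x1F,         # PID high bits (error/start flags omitted)
            pid & 0xFF,                # PID low byte
            0x10 | (counter & 0x0F),   # payload-present flag + continuity counter
        ])
        packets.append(header + chunk)
        counter += 1
    return packets

stream = packetize(b"inventory update" * 100, pid=0x120)
assert all(len(p) == 188 and p[0] == 0x47 for p in stream)
```

Because the data rides in ordinary TS packets on its own PID, the existing multiplexer and modulator chain needs no changes to carry it.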

All that is required for data reception is an appropriate antenna with LNBF, a run of cable and an indoor unit (IDU), which would typically consist of a satellite receiver with an Ethernet or USB interface for connecting to a PC or network, or a receiver card that is mounted inside the receiving computer. If interactive (full two-way communications) services are contemplated, a satellite interactive terminal (SIT) with appropriate transmitters and modems would be required, but that’s another money-making idea for another time.

Data can also be delivered into the Internet backbone for reception via a standard Internet connection.

Services that can easily be offered

Having squeezed out the available Mbps, the next question is, what are they good for?

While most proposed broadband services are expected to be interactive, one-way broadcast-multicast offers some potentially profitable opportunities, such as the following.

Audio distribution: Every month production companies produce background music and other content for corporate customers to play back at hundreds of stores and businesses. Currently, the content is pressed on CDs and distributed via Federal Express, or it is transmitted via the Internet or other channels. Transmission via IP over MPEG2-DVB can reduce costs for the end user and/or increase quality.

Live streaming of audio content (web events): By using a satellite uplink with multicast to multiple points on the Internet backbone, or to a data reception terminal as described above, live audio content can be distributed, avoiding net congestion and reducing costs for the promoter.

Streaming video: Live video events such as stockholder meetings, corporate briefings, conference calls, etc. are already being carried over MPEG2. However, the infrastructure costs are too high for most corporations. A less costly approach is to send the video feed via the Internet directly to an uplink site and then broadcast (MPEG1 or 2 video over IP over MPEG2 transport stream) for direct reception at viewer sites.

And then there are the less glamorous possibilities:

Multicast data: Satellites still cannot be beaten for applications where large amounts of data must be sent to large numbers of remote sites. A good example is the distribution of inventory and pricing databases to point of sale computers. A single file of data can be broadcast to all computers in the network. If there are regional differences in pricing or product lists, the data can be directed to a group of specific computers or terminals. In a network, a terminal — which can include anything from point of sale cash registers to handheld scanners to portable or fixed MP3 players to PCs — can be a member of multiple multicast groups as well as having an individual address. Thus the MPEG2-DVB infrastructure can be utilized to broadcast new pricing information to all stores, while uploading a new operating system to a backup computing facility in the Midwest.

Webcasting: In many regions of the world, access to the main Internet backbone based in the United States is poor. By setting up a webcasting system where popular content is broadcast and stored on local caches, users in distant regions can be given faster response time to popular content (since their request will be in the local cache vs. going through a slow connection to the backbone).
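The group addressing described under multicast data is simple to model: every message goes out over the broadcast link, and each terminal keeps only traffic addressed to one of its groups or to its individual address. A sketch (terminal names and group labels are invented for the example):

```python
# Model of satellite multicast addressing: the broadcast reaches every
# terminal, and each terminal filters locally on group membership or
# its own individual address.

class Terminal:
    def __init__(self, address, groups):
        self.address = address
        self.groups = set(groups)
        self.inbox = []

    def receive(self, destination, payload):
        # Accept individually-addressed or group-addressed traffic only.
        if destination == self.address or destination in self.groups:
            self.inbox.append(payload)

def broadcast(terminals, destination, payload):
    for t in terminals:                  # the satellite reaches everyone...
        t.receive(destination, payload)  # ...each terminal filters for itself

register = Terminal("store-042-pos", groups={"all-stores", "midwest"})
scanner = Terminal("store-042-scan", groups={"all-stores"})

broadcast([register, scanner], "all-stores", "new price list")
broadcast([register, scanner], "midwest", "regional promo")
broadcast([register, scanner], "store-042-pos", "os update")

assert register.inbox == ["new price list", "regional promo", "os update"]
assert scanner.inbox == ["new price list"]
```

One transmission, any number of receivers, with regional and individual targeting layered on top — exactly the property the column credits to the MPEG2-DVB infrastructure.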

Potential customers

Who is going to buy all these wonderful services?

In many organizations the MIS department is also responsible for the telecommunications facilities used in the corporation. Unless their business is telecommunications, most MIS departments are woefully understaffed, and have a great deal of difficulty finding fully qualified staff. In most cases, they don’t go looking for solutions that will add to their workload, and they usually don’t replace what is already working unless there is a compelling case to do so.

So a simple phone call won’t cut it.

The key is to present a full solution, and to bring in the necessary partners who can install and service the technology. Then hammer home your story of cutting-edge technology coupled with cost savings. A simple comparison of the cost of various types of telecom facilities to service a variable number of remote locations is quite compelling. The diagram shows a rough comparison of network cost for frame relay, ADSL Internet, ISDN dial-up Internet and satellite transponders.

It is clear that with satellite, the recurring costs of the communications facilities (the satellite) are fixed — no matter how many remote terminals are receiving the signals. Telecommunications costs are being reduced all the time, as are Internet access costs. However, with satellite, not only can costs be lower, but the throughputs achieved can be higher, since Internet congestion will not be a factor.
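The fixed-versus-per-site economics can be made concrete with a break-even calculation. The dollar figures below are invented purely for illustration, not quotes from any provider: terrestrial lines carry a recurring cost per site, while the transponder fee is flat no matter how many terminals listen.

```python
# Illustrative break-even for the fixed-cost argument above.
# All dollar figures are hypothetical, not real tariffs.

TRANSPONDER_MONTHLY = 100_000   # flat satellite cost, any number of sites
FRAME_RELAY_PER_SITE = 800      # terrestrial recurring cost per site

def monthly_cost_satellite(sites):
    return TRANSPONDER_MONTHLY          # fixed regardless of receiver count

def monthly_cost_terrestrial(sites):
    return FRAME_RELAY_PER_SITE * sites # grows with every added location

# Find the site count where satellite becomes the cheaper network.
breakeven = next(n for n in range(1, 10_000)
                 if monthly_cost_satellite(n) < monthly_cost_terrestrial(n))
print(breakeven)   # 126 sites under these assumed figures
```

Past the crossover point, every additional receive site is effectively free on the satellite side, which is why the comparison favors satellite more strongly the larger the network grows.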

What’s this bandwidth worth?

If the satellite industry is to thrive as a broadband provider and gain that 15-30 percent of the billions that will be spent, it needs to learn as much as possible about providing services directly to the commercial and consumer mass markets. Pricing is an element of that lesson.

Marisat F-2 continues to serve in the scientific community

It doesn’t have a nickname like old faithful, old goat or crusty the satellite. The controllers and those involved with it on a daily basis call it plain old Marisat F-2.

Maybe that’s because Comsat’s Marisat F-2 doesn’t act its age or that the optimism of its prolonged youth keeps it trekking around the planet.

Either way, Marisat F-2, at 24 years old the oldest commercial communications satellite in service, continues to serve a purpose even today. The U.S. Navy Space and Naval Warfare Systems Center has signed an agreement with Comsat to use Marisat F-2 on behalf of the U.S.-based National Science Foundation (NSF), keeping the satellite functional for at least another year.

The satellite, launched into geosynchronous orbit in 1976 from Cape Canaveral, Fla., will be used to transfer research data gathered by scientists at the Amundsen-Scott Station at the South Pole back to the United States. The NSF is installing a large, 9-meter antenna at the South Pole to transfer the data to the satellite, replacing the existing 3-meter antenna.

Comsat will receive data via its gateway facilities located at its teleport in Clarksburg, Md., with the use of a new, very powerful antenna of its own.

Old man satellite

The agreement seems odd considering the age of Marisat F-2. In 1976, the year Marisat F-2 was launched, Jimmy Carter was elected president of the United States, Apple Computer was founded, Chinese Premier Zhou Enlai died of cancer and North and South Vietnam were united.

Since then, Marisat F-2 is estimated to have made 8,500 trips around the Earth, covering more than 2.2 billion kilometers.
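That mileage figure checks out on the back of an envelope. Assuming a standard geosynchronous orbit radius of about 42,164 kilometers (a textbook value, not one given in the article):

```python
import math

# Back-of-envelope check of the article's mileage figure. A geosynchronous
# satellite circles Earth once per sidereal day on an orbit of radius
# ~42,164 km measured from Earth's center (assumed textbook value).
GEO_RADIUS_KM = 42_164
orbit_circumference_km = 2 * math.pi * GEO_RADIUS_KM  # ~264,900 km per trip

trips = 8_500
total_km = trips * orbit_circumference_km
print(f"{total_km:.3e} km")  # roughly 2.25e9 km, i.e. "more than 2.2 billion"
```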

The contract between NSF and Comsat is for just one year with options to continue through 2005, but that has nothing to do with the mileage of Marisat F-2, according to Dan Swearingen. Swearingen has been with Comsat for the entire life of Marisat F-2 and is now the vice president for advanced engineering and planning.

“It’s to minimize hassle with government contracting,” Swearingen says. “Multiyear contracts create a lot of hassle on their part.”

Marisat F-2 has a few age-induced blemishes and has long outlived its expected five-year design life, but the people at Comsat aren’t worried about failure.

“It’s had all of its capacity operational throughout its lifetime,” says Swearingen. “It’s lost a little redundancy, and being so old, its solar cells and its station-keeping fuel are a bit depleted. But it’s still got plenty of life left.”

And there’s plenty of reason for Comsat’s confidence in the satellite, as it has encountered few technical problems in its 20-plus years of existence. The most serious incident it has suffered involved the uninstructed use of fuel, according to Swearingen. In the end, the problem turned out to be beneficial to Inmarsat (Comsat is a signatory of Inmarsat) by reducing the inclination of the satellite. Otherwise, the history of Marisat F-2 is virtually untarnished by inadequacies or failures.

That doesn’t mean NSF hasn’t taken notice of its age. The foundation is very experienced in using older satellites and didn’t overlook the fact that Marisat F-2 has exceeded its expected lifespan by nearly 20 years.

“Yes, we are,” says Patrick Smith, technology development manager in the NSF’s Office of Polar Programs, when asked if NSF is at least a little worried about the satellite’s age. “So, we have a kind of orbit-sparing philosophy. We have more than one, all with similar capability. If you lose one, you can still provide the same kind of service with the same kind of quality levels, but what happens is you lose a little contact time per day.”

But NSF had few choices that could provide the kind of service necessary and Marisat F-2 proved to be the best option.

“When you’re into this business right now and you know how you can get this kind of capability, the realization is, based on some surveys that we have done, the only way we can get an inclined satellite like this is one that is beyond its published lifetime,” says Smith. “That’s because most operators will not disable north-south station keeping until sometime into the spacecraft mission, with the motivation to preserve life. If you were to start at year zero following the launch and let it drift, it would probably take around nine to 10 years to drift to this inclination.

“The geosynchronous Comsat folks have always been good about debris management, meaning when you hit a certain state in spacecraft life, it’s disposed. When they dispose them, we can’t use them.”
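Smith’s nine-to-10-year estimate is consistent with the typical inclination drift of an uncorrected geostationary orbit. The drift rate below (roughly 0.85 degrees per year) and the 8-degree target are illustrative assumptions, not figures from the article:

```python
# Once north-south station keeping is disabled, lunar and solar gravity
# pull a geostationary orbit's inclination upward at roughly 0.75 to 0.95
# degrees per year (assumed typical range; the article quotes no rate).

def years_to_inclination(target_deg, drift_deg_per_year=0.85):
    """Estimate years of uncorrected drift from zero to a target inclination."""
    return target_deg / drift_deg_per_year

# An inclined-orbit satellite visible from the South Pole might need an
# inclination near 8 degrees (illustrative value):
print(round(years_to_inclination(8.0), 1))  # ~9.4 years, matching "nine to 10"
```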

In the name of science

The NSF’s purpose in the South Pole is to gather scientific data in various subjects and transfer it to the United States.

“The data that’s flowing would come mainly from our Astronomy and Astrophysics programs,” says Smith. “They have a 1k-by-1k infrared telescope for sky imaging. We have a group that’s using the ice cap as a neutrino detector. They have basically a prototype for a large, cubic-kilometer-size array that they would like to build to form a telescope that would detect neutrinos.”

Other projects in the South Pole that need data to be transferred involve biology and medicine, geology and geophysics, ocean and climate and glaciology.

The hope is that Marisat F-2 will help improve the quality of communications and life for the scientists and staff working at the South Pole, with a hefty transfer rate of 2 Mbps or more from the South Pole and 1.5 Mbps or more from the U.S.

Marisat F-2 is one in a very small handful of satellites capable of serving the NSF’s needs.

“The National Science Foundation would like it to have eternal life,” says Swearingen. “They value these inclined orbit satellites. The mission they’re trying to take care of is the South Pole. Ordinary geostationary satellites can’t be seen from the South Pole. They had their eyes open for inclined orbit geosynchronous satellites and non-geostationary, but they were looking for satellites that had enough bandwidth to carry at least a megabit per second.”

Marisat F-2 is smaller than most of today’s satellites, weighing about 680 kilograms, with a height of 3.81 meters and a diameter of 2.03 meters. It is one of three Marisat satellites, but it is the only one still serving a purpose.

Marisat F-2 will be visible to scientists for five to six hours per day and has the equivalent of 48 telephone circuits, capable of 28 million bits per second. It will deliver NSF’s data at L-band. NSF is using four other aging government satellites (TDRS F-1, GOES 3, LES 9 and ATS 3) along with Marisat F-2 to provide numerous services to the South Pole Station on a more continuous basis.

“We have a sort of junk-box collection of spacecraft we’re using,” says Smith. “All were launched a very long time ago. Another satellite we’re using for a capability similar to Marisat is the old weather satellite, GOES 3. You put the two together and we’ll get about 11 hours a day of continuous satellite coverage.”

ATS 3 is the oldest noncommercial communications satellite in use. It was launched in 1967, nine years prior to Marisat F-2.

At the moment, TDRS F-1 is enabling the majority of data transfers as it can provide an astounding transfer rate of 5 Mbps. The problem is that TDRS F-1 is nearing the end of its life. This means NSF needs another satellite with decent transfer rates to replace it. Enter Marisat F-2.

“What we’re going to use Marisat for in addition to general Internet service is we’ll also be using it as a backup, high-speed data relay,” says Smith. “I think it will meet our current need, but it doesn’t have a lot of long-term growth capacity.

“But it does an important thing for us in that it provides backup for the NASA [TDRS] satellite. The NASA satellite is in very bad condition and NASA is worried about losing control of it. Marisat will give us basically a backup and depending on how NASA feels about the health of the satellite a replacement for TDRS F-1.”

Still, there are plenty of valuable uses for Marisat F-2 while TDRS F-1 is still functional. Marisat F-2 will offer Internet service and services that will improve the quality of life for the scientists.

The link will be asymmetric, with the outbound direction from the South Pole having a bigger pipe than the inbound direction from the U.S. to the Pole.

“For Marisat in particular, the big service we are trying to provide is Internet for all intents and purposes,” says Smith. “This convergence helps us out because we’ve already made a commitment to have Internet to begin with and collapse all of the skills into it.

“We’re going to be using it for a lot of things, including sending science data, but also people will be using it for morale purposes. The people out at the South Pole Station, it’s not like they can go off to another phone company and get their own private Internet or phone service. We have to do it all and with a reasonable amount of quality. So, we have to have enough head room to support that.”

The satellite still has to be tested by the government once it reaches its station in August.

Finding the Holy Grail

The reason for Marisat F-2’s nearly immortal life span isn’t an unpublished Comsat secret. It’s a matter of basic janitorial-like work.

“The satellite has been well taken care of, for one thing,” says Swearingen. “The other is that it’s been parked at a point in the orbit where the east-west station-keeping fuel requirements are not so large. Twice a year you have an eclipse season, and the batteries, which carry the electric load, are reconditioned shortly before the season.”

Dennis Boiter, Comsat’s manager of orbital dynamics, adds, “We’re very conservative in what we do with the satellite and we never spend fuel that we don’t have to.”

Finding the megatrends

Staying ahead of the competition is usually the most important thing a company, including a broadband satellite enterprise, can do. Then. Now. And forever. Period. Within that parameter, staying on top of killer applications and killer trends is absolutely critical.

In Washington, D.C., a new company called Third Millennium Forums (3MF) was formed to address this need, under a banner along the lines of “Finding the Megatrends and Killer Apps.” Composed of a faculty of top analysts in their respective fields, 3MF offers key companies and individuals answers to their questions in the telecommunications, media and computer sectors.

Specific areas of expertise include convergence; the wired, wireless and satellite industries; Advanced Interactive Multimedia (AIM) services; and Internet services and technologies. Specialized emphasis is on technologies such as High Definition TV (HDTV), digital TV, Internet Protocol (IP) telephony, Big and Little Low Earth Orbiting satellites (LEOs), Personal Communications Services (PCS), Local Multipoint Distribution Services (LMDS), Multichannel Multipoint Distribution Services (MMDS), cellular, Digital Subscriber Line (DSL) and streaming media.

Recent 3MF meetings have focused on providing information to a core of satellite executives, such as service providers, network operators, launch services providers, spacecraft manufacturers, end-users, financiers, lawyers and others close to these disciplines.

In this context, a handful of trends (and some key developing questions) have repeatedly arisen. Loosely termed, 3MF labels these trends “The Top Five Megatrends”:

Choice: Led by the revolution in video telecommunications, which itself was led by DirecTV’s foray into the efficient utilization of digital video compression, choices of video, data, audio and almost every other type of signal are the norm. This, in turn, has heightened the need for new channels of software and new versions of hardware worldwide. A couple of questions we focus on here: Is there any such thing as too much choice for today’s and tomorrow’s business and private, home-based consumer? Is more choice really necessary in order to drive more revenues?

Consolidation: Who would have ever thought one of the world’s oldest and largest media organizations, Time Warner (TW), would become the subject of a buyout by an Internet concern about a fifth TW’s age, AOL? Or that DirecTV would eventually end up owning not only the logical candidate, U.S. Satellite Broadcasting, but also its fiercest rival, PrimeStar? Yet those alliances merely heighten the prospects for more, and more unusual, future consolidations. A question presented by this trend: Where’s the friction point between a business too big and one not big enough? Another key question: When does a company cross the line where its size impedes more than it helps the consumer?

Content: The Ruperts, the Johns, and the Teds of the telecom and media worlds have long recognized the value in owning content, especially when it comes to a fairly prompt return on the investment. Among many, examples include Murdoch buying the Fox Network and the L.A. Dodgers; Malone creating his special Liberty Media unit under the new AT&T umbrella; and Turner long ago making world class content the center of his broadcasting and telecom empires. A top question: What’s the next killer app in the content realm? Another important question: How do companies adequately (and properly) control their new content?

Cost: How do you price a Megatrend? Put another way, how do you tweak the cost model for a product or service that represents the next Megatrend? Do you double or triple the cost at retail and focus on the select few subscriber/purchasers, or do you seek the mass audience and cut the margins? Or is the real answer a constantly rotating mix of these two and innumerable variations?

Convergence: Not to be confused with Consolidation, this is the version of “getting together” that looks mostly at the issue of products, and occasionally services, that cross over into something else. Examples are so replete that one wonders when production begins on the next “Do Everything” set-top box or net appliance. What should companies be lining up to converge with today, in anticipation of minitrends that will forever alter business models?

Ultimately, bringing these Megatrends together under one roof is the challenge of just about every organization today. Bringing these five Megatrends (and many others) to the surface, and then attempting to answer the questions posed by their presentation, is the goal of 3MF and its global faculty. Key founding faculty presently include Washington, D.C.-based Internet Guru Gary Arlen; Washington, D.C.-based Satellite Expert Mickey Alpert; Ann Arbor, Mich.-based Wireless Wiz (and former U.S. Congressman) Wes Vivian; and the author.

For Satellite Communications readers wishing to see the core of these Megatrends presented, 3MF will be co-managing and conducting a General Session, entitled “Megatrends and Killer Apps,” at Phillips’ three-day Satellite 2001 event in Washington, DC, early next year.

Satellite renaissance

In case you’ve been wondering who to thank (or strangle) for today’s monster broadband, look no further than the satellite industry. Okay, fiber gets some credit, too. But it’s a relative newcomer to the broadband market when compared to satellite.

Satellites were the original, curiously strong broadband. Proto-broadband, if you will.

For many, many moons, dumb satellites have served smart broadband to a panoply of bandwidth-gobbling applications. Broadcasting video, audio and data is the most obvious example of how satellite services deliver broadband solutions. More recently, we’ve watched satellite broadband serve ISP markets. And it looks as though streaming video and audio will call upon satellite broadband next.

Satellites offer a solution for broadband applications.

The term broadband combines those tasty elements of a digital broadcast world with the spice and verve of the Internet. Broadband is far more rooted in reality than convergence. And it kicks multimedia’s butt. Broadband represents a super-evolution of communications.

I don’t want to wrap broadband in too much poetry, but here’s the deal: Twenty years ago, satellites didn’t sport the broadband label, but by today’s standards, broadband is precisely the market satellites have served. And now broadband is it. It’s in. It’s hip. And we’ve got it.

The satellite industry is entering a new phase. Over the course of 20 years, we at Satellite Communications have watched the satellite industry morph from an equipment-based industry to a services-based industry. The vague cachet of building satellite buses has given way to a world obsessed with network services.

So I think I can say with a high degree of certainty that, thanks to the Internet, satellite services are almost sexy.

Gone are the days when the satellite industry operated under an air of protectionism and a philosophy of “us satellites vs. them fibers” or “us satellites vs. everyone else.” It’s a whole new attitude. And it’s going to be epic.

Bye-bye military mentality. Hello broadband.

The new era is not about communications via satellite; it’s about broadband access. And satellites figure prominently in the new landscape.

Broadband is about having a healthy disrespect for authority and integrating satellites into fiber; and combining broadcasting and the Internet. It’s about improving business through the addition of satellite capacity. It’s about understanding entrepreneurs. It’s about exploiting satellite resources. It’s about satellite solutions.

And so after waxing poetic about broadband for far too long, I would like to introduce you to the super-evolution of this magazine. Satellite Communications magazine staffers are all fluffed up with excitement over our new name, new look and new editorial. Beginning next month, Satellite Communications becomes Satellite Broadband.

To support our changes in both content and design, we have built an ace staff: Associate editor Peter Jakel joins us from the newspaper world. Award-winning designer Jani Duncan will capture the look and feel of broadband as our new art director.

It is truly a renaissance for satellite services and Satellite Communications.