
Serving suggestion

The death of the mainframe has been prophesied for years. The 1980s initiative to power up the desktop started a move away from centralised to distributed, client/server computing. But the hardware industry is about to witness a sea change, and streaming media is the driving force.

'The focus now is on content delivery and how to speed it up,' says Tim Jennings, analyst at Butler Group.

But given the restrictions of today's internet technology, it's not a simple problem to solve. Firstly, there's bandwidth (or the lack of it). Then there are slow and unreliable connections, as well as congestion and fluctuations in demand. As Jennings points out, streamed material is as yet an unknown quantity.

These problems have ignited a flurry of activity in the server market. Manufacturers such as IBM and Sun Microsystems have recently introduced high-end servers, pushing high-performance computing to new limits.

The Sun Fire 15K contains up to 106 high-speed UltraSPARC III processors running at up to 900MHz, and boasts more than half a terabyte of memory. Sun maintains that it can be partitioned into 18 domains, each one handling a different job. And, the company says, it can run code 71 times faster than a single-chip box.

IBM's offering is the long-awaited pSeries 690 (codenamed Regatta), a 32-way system employing the company's much-hyped Power4 processor. The company says that users can create both Linux and AIX (IBM's flavour of UNIX) logical partitions via a 'hypervisor' facility.

Such high specifications pack a punch in terms of horsepower, which is what certain streaming applications need, but not all of them. Some would argue that users get much more from multiple, smaller systems.

But the fact that IBM and Sun have launched such powerful beasts indicates a move back to centralised computing. Whether or not this is a good thing is still up for debate.

The fact that the big players are sparring with each other for a bigger share of the market is good news for users lower down the scale. Many features, such as diagnostics and automated repair, are being pushed down into mid-range and lower-range products.

Jennings points out that for smaller business applications or content providers wanting to test the market, a more flexible approach might be to employ smaller systems that cost much less to build. Another trend that analysts have spotted is the move towards outsourcing.

'You need the big servers to meet the demand for a finite number of connections,' Jennings explains. 'But these are million-dollar, entry-level machines and not every business has that kind of money for a new venture.'

Craig Robertson from Fujitsu Siemens' Broadband Solutions division warns of the potential security issues associated with outsourcing. 'If what is being streamed is not company confidential nor valuable, then it's not an issue, but if content is also a revenue stream (such as video-on-demand) it could be.'

The performance benefits of Sun and IBM's offerings cannot be ignored. 'There's a place for them,' says Jennings. 'But you have to consider what the applications for streaming media are.'

On the business side, it is thought that streaming will be used for deploying corporate communications and e-learning, handled by smaller, distributed networks. The consumer side is where the big, high-end servers will be needed, for entertainment portals and suchlike.

Robertson says that the difference between business and consumer markets where streaming media is concerned is less to do with technology and more about business models. 'The underlying server technology is the same,' he says. 'But it's all about bandwidth as far as the network is concerned.'

In general, customers are used to dialling in on a 56K connection, and, depending on the number of hops between routers, the level of system performance and availability is pot luck. Broadband internet is a different beast. Streaming video is where it becomes complex, given the restrictions of current network technology. Broadcast quality demands speeds of 20-40Mbits/sec, while high-bandwidth internet currently offers 2-4Mbits/sec, so streaming media is about pushing the boundaries. That's where server architectures come in.
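To put those figures in perspective, here is a rough back-of-the-envelope comparison of that gap, using the article's round numbers as midpoints (a sketch, not a measurement):

```python
# Rough comparison of the bandwidth gap described above.
# The figures are the article's round numbers, not measured values.
broadcast_mbits = 30     # broadcast quality: roughly 20-40 Mbit/s
broadband_mbits = 3      # high-bandwidth internet: roughly 2-4 Mbit/s
dialup_mbits = 0.056     # a 56K dial-up connection

print(f"Broadband shortfall vs broadcast quality: about {broadcast_mbits / broadband_mbits:.0f}x")
print(f"Dial-up shortfall vs broadcast quality: about {broadcast_mbits / dialup_mbits:.0f}x")
```

Even at the midpoints, a broadband connection carries roughly a tenth of the bandwidth broadcast quality demands, which is why compression and careful server placement matter so much.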

Servers come in two parts: the hardware and the software. There's a vast range of server hardware available, from low-end, low-powered boxes (typically installed in simple office and business-administration networks), right up to the high-powered workhorses of the Sun Fire 15K and pSeries 690 variety. The more the machine costs, the more powerful, resilient and intelligent it will be. Mid-range offerings, like Fujitsu Siemens' Primergy, are typically Intel x86-based.

On the software side are streaming products like Windows Media, alongside security, database, application and clustering software.

Server architectures need to be scalable to allow for business growth and fluctuations in usage. Availability and reliability must also be built in, especially if the application is mission critical. Certain servers are fault-tolerant and adhere to the five-nines (99.999%) reliability standard, but content providers are now demanding 100% uptime and performance. Distributed servers with 'fail-over' built in can, manufacturers claim, be networked in such a way as to offer 100% reliability. Security has to be built in as well, along with a system of load balancing.
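As a minimal illustration of the fail-over idea (the hostnames and port below are hypothetical placeholders, not taken from any vendor's product), a front end can simply try each streaming server in turn and skip any that do not answer:

```python
import socket

# Hypothetical list of streaming servers sitting behind one service.
STREAM_SERVERS = ["stream1.example.com", "stream2.example.com", "stream3.example.com"]
RTSP_PORT = 554  # used here purely as a reachability check

def pick_server(servers=STREAM_SERVERS, port=RTSP_PORT, timeout=2.0):
    """Return the first server that accepts a connection, or None if all are down."""
    for host in servers:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return host
        except OSError:
            continue  # server unreachable: fail over to the next one
    return None

if __name__ == "__main__":
    server = pick_server()
    print(f"Serving from {server}" if server else "No streaming server available")
```

Real load balancers add health checks, session stickiness and weighting on top of this, but the principle of routing around a failed node is the same.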

Robertson says there is little that can be done with a server to directly overcome bandwidth problems. So businesses wanting to stream corporate data and e-learning over ordinary internet connections, for example, will have to build distributed server architectures, locating the servers as near as possible to the end user.

Robertson explains: 'The difference between centralised server architectures and distributed is that tasks are split and separated using the pipe that connects them. If a business has enough available bandwidth (such as Kingston Communications operating on a high-speed ADSL network) you don't need to distribute content servers. But, if bandwidth is a problem, servers need to be distributed closer to the end customer.'
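Pushing that idea one step further gives a crude picture of how a distributed architecture chooses a server: measure how quickly each content server answers and prefer the 'closest' one. Again, the hostnames below are hypothetical and this is only a sketch of the principle, not any particular vendor's implementation:

```python
import socket
import time

EDGE_SERVERS = ["edge-north.example.com", "edge-south.example.com"]  # hypothetical

def nearest_server(servers=EDGE_SERVERS, port=80, timeout=2.0):
    """Return the server with the lowest connect time, or None if none respond."""
    best_host, best_time = None, float("inf")
    for host in servers:
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                elapsed = time.monotonic() - start
        except OSError:
            continue  # unreachable edge server: ignore it
        if elapsed < best_time:
            best_host, best_time = host, elapsed
    return best_host
```

In practice, content delivery networks do this with DNS and routing data rather than ad hoc probes, but the goal is identical: shorten the pipe between the content server and the end customer.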

So serving streaming media is really about building a bridge between content and end users. How many servers are needed, and how powerful they must be, depends on where both are located and how far apart they are. But, for the time being at least, it is also about compensating for the limits of current internet technology.
