Webcast

A webcast is a broadcast that, unlike a web video, is transmitted live (livestreaming) over the Internet. A broadcast may also be pre-recorded. Webcasts use streaming media. A webcast that carries only sound is called an audio webcast.

Webcast video conference

Technique

A webcast requires a camera or other video signal, an encoder, an on-site Internet connection and a streaming server. The encoder is often a powerful PC with a FireWire input or a dedicated video capture card to which a video source can be connected. The streaming encoder software runs on this computer.

QuickTime Broadcaster is available free for the Mac and can encode live in the QuickTime, MPEG-4 and 3GPP (for mobile phones) formats. Windows Media Encoder is available free for Windows XP and can encode live in the Windows Media format.

Because the on-site Internet connection is almost never fast enough to serve multiple viewers directly, the stream coming from the encoder is relayed through a streaming server. This can be done by means of a push or, in some cases, a pull. In a push, the encoder initiates the stream to the server. In QuickTime Broadcaster, the user sets the server's IP address and the audio and video ports on the server, then generates a Session Description Protocol (.sdp) file and places it on the streaming server. In Windows Media Encoder, the user specifies the name of the server together with a username and password.
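As an illustration, a minimal .sdp file for such a push setup could look like the following; the address, port and payload type are invented, and the exact file QuickTime Broadcaster generates depends on the chosen encoding settings:

```
v=0
o=- 0 0 IN IP4 192.0.2.10
s=Example webcast
c=IN IP4 192.0.2.10
t=0 0
m=video 5004 RTP/AVP 96
a=rtpmap:96 H264/90000
```

The streaming server reads this file to learn where the encoder's RTP stream arrives and how it is encoded.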

Depending on the audience to be reached, a distinction is made between broadcast, multicast, narrowcast, unicast and anycast. This influences the choice of techniques.

Podcast

A podcast is an audio broadcast whose sound file is provided on demand through web feeds. Thanks to the introduction of portable mp3 players such as the iPod, this form of broadcasting quickly became popular among radio amateurs. The term podcast is a contraction of iPod and broadcast.

History

In 1997, Dave Winer described a way to include references to media files in RSS feeds. The method was then mainly used for web syndication. Podcasts became popular because of
(a) the breakthrough of broadband connections to consumers;
(b) the arrival of portable mp3 players with sufficient storage capacity and
(c) the advent of applications for listening to a portable media player or mobile phone.

The term podcasting first surfaced in 2004 when journalist Ben Hammersley described the new broadcasting technique. Dannie J. Gregoire adopted this term, after which Adam Curry popularized podcasting as something revolutionary.

Technique

For a podcast, the sound file is provided on demand through web feeds. The web feed contains a reference to the media file. An application that checks this feed for changes and can be instructed to download new files automatically is called a podcast aggregator or podcatcher. Often this functionality is integrated into a media player such as iTunes. There are also podcast aggregators for smartphones. Some of these applications offer their own guide of podcasts to which one can subscribe, and there are also websites with podcast guides.
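The core of a podcatcher can be sketched in a few lines: parse the feed XML and collect the media URLs from the enclosure tags. The feed below is a made-up example, not a real podcast:

```python
# Minimal podcatcher sketch: scan an RSS feed for <enclosure> URLs.
import xml.etree.ElementTree as ET

def find_enclosures(rss_xml: str) -> list[str]:
    """Return the media-file URLs referenced by a feed's <enclosure> tags."""
    root = ET.fromstring(rss_xml)
    return [enc.attrib["url"]
            for enc in root.iter("enclosure")
            if "url" in enc.attrib]

# Hypothetical feed content for illustration:
sample_feed = """<rss version="2.0"><channel>
  <title>Example show</title>
  <item>
    <title>Episode 1</title>
    <enclosure url="http://example.com/ep1.mp3" type="audio/mpeg" length="123"/>
  </item>
</channel></rss>"""

print(find_enclosures(sample_feed))
```

A real podcatcher would fetch the feed periodically, compare the enclosure URLs against those it has already seen, and download only the new ones.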

Copyright

For the distribution of copyrighted music there are legal rules: music may only be made available with the consent of the rights holders.

When an artist does not wish to assert these rights, the music is called podsafe. This corresponds to a Creative Commons license.

Popularity

Podcasts gave radio amateurs a new platform with a wider reach. Podcasts created by hobbyists peaked in 2005. The emerging competition from professional radio makers, with public service broadcasters at the forefront, marginalised the number of active hobbyists.

Video podcast

The term podcast is also used for broadcasts or files with video (video podcast or vodcast) that are also suitable for listening only. Besides the ability to stream or play the video and watch it, such video podcasts often offer the ability to consume only the audio on another website. Sometimes a podcast host briefly explains what is on display for the benefit of those who only listen.

Uploading

Uploading is computer jargon for sending files or other data from one computer to another, where the sending computer takes the initiative. The sender is called the client, the receiver the server. Making data from a local computer accessible on the Internet, for example at X-Stream.co.uk, is done with an upload, but sending e-mail with SMTP is also an upload.

Transferring files from external data sources such as a CD-ROM or a digital camera to one's own computer can also be referred to as uploading. By convention, uploading refers to data transfer from a small medium (e.g. a CD-ROM) to a large medium (e.g. the computer); for downloading, the opposite holds.

Upload from a browser

For uploading files from a form in a web browser (via HTTP), the W3C standardised an encoding: multipart/form-data. This encoding allows one or more files to be sent to the server in a single request, together with the other form fields. On the server, the request can be dissected back into the original form data and files.
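What this encoding looks like on the wire can be sketched with Python's standard library; the boundary, field names and file content below are invented for illustration:

```python
# Sketch of the multipart/form-data body a browser produces for a file upload,
# and how a server can dissect it back into fields and files.
import email.parser
import email.policy

boundary = "----formboundary1234"
body = (
    f"--{boundary}\r\n"
    'Content-Disposition: form-data; name="description"\r\n\r\n'
    "holiday photo\r\n"
    f"--{boundary}\r\n"
    'Content-Disposition: form-data; name="upload"; filename="photo.jpg"\r\n'
    "Content-Type: image/jpeg\r\n\r\n"
    "...binary image data...\r\n"
    f"--{boundary}--\r\n"
).encode()

# The server side: parse the request back into its parts (the MIME parser
# handles any multipart/* type, including multipart/form-data).
msg = email.parser.BytesParser(policy=email.policy.default).parsebytes(
    b"Content-Type: multipart/form-data; boundary=" + boundary.encode()
    + b"\r\n\r\n" + body
)
for part in msg.iter_parts():
    print(part.get_param("name", header="content-disposition"),
          part.get_filename())
```

Each part carries a Content-Disposition header with the form field name; file parts additionally carry a filename and their own content type.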

In the early days of the World Wide Web it was not possible to upload a file from the browser. For Internet Explorer version 3 a special add-on had to be installed to make this possible; Netscape supported file upload from version 3. As of 1997, file upload has been built into browsers by default.

Upload with FTP

You can also upload a file via FTP. This requires an FTP client (to send the file) and an FTP server (to receive the file).
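A minimal sketch of such an upload with Python's built-in ftplib; the host, credentials and filename are placeholders:

```python
# Sketch of a file upload over FTP with Python's built-in FTP client.
from ftplib import FTP

def upload_file(host: str, user: str, password: str, local_path: str) -> None:
    """Send one file to an FTP server with the STOR command."""
    with FTP(host) as ftp:              # client opens the control connection
        ftp.login(user, password)
        with open(local_path, "rb") as f:
            # STOR asks the FTP server to receive (store) the file
            ftp.storbinary(f"STOR {local_path}", f)

# Example call (placeholder server and file):
# upload_file("ftp.example.com", "user", "secret", "report.pdf")
```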

Upload vs. download

The reverse process, transferring or copying files from a server to the client, is called downloading.

The result of both uploading and downloading is that a file is copied from one computer to another. The terms upload and download are applied from the client's perspective. The program that starts the activity is the client; the program that allows the activity (or refuses it, for example when an incorrect login is used) is the server.

P2P programs like Kazaa act as both client (when downloading to the local computer) and server (when offering files for download). The client then uploads and downloads at the same time.

Download vs. stream

Downloading and streaming are two basic techniques for distributing media over the Internet. Streaming evolved from downloading and makes it possible to consume media while it is still being received, starting at any point.

Download

  • Classical downloading – bringing in a file in its entirety before it can be played;
  • Progressive downloading – where a data file can be played before it has been fully received.
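The difference between the two download variants can be illustrated with a small in-memory simulation; the "media file" and chunk size are made up:

```python
# Classical vs. progressive downloading, simulated in memory.

def receive_chunks(data: bytes, chunk_size: int = 4):
    """Simulate a network transfer by yielding the file piece by piece."""
    for i in range(0, len(data), chunk_size):
        yield data[i:i + chunk_size]

media = b"0123456789ABCDEF"   # hypothetical media file

# Classical download: collect the whole file before playback can start.
whole_file = b"".join(receive_chunks(media))

# Progressive download: start "playing" each chunk as soon as it arrives.
played = bytearray()
for chunk in receive_chunks(media):
    played.extend(chunk)      # a real player would decode and render here

assert bytes(played) == whole_file == media
```

Both approaches end up with the same bytes; they differ only in when consumption can begin.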

Stream

  • Livestreaming – where the sender sets the pace and the receiver can start consuming at any moment; also known as a webcast;
  • On-demand streaming – where the sender sends as soon as the recipient asks (as with video on demand and web videos).

Pros and cons

Every technique has its advantages. The trade-off is mainly between continuity of reception and the integrity of the file transfer. The classic way of downloading is used when collecting files that must be received without defects; communication then often takes place over the TCP protocol, which is designed for high reliability on poor connections. Sometimes continuity is more important than quality: for voice conversations, online multiplayer games or viewing footage, streaming is therefore often chosen, frequently over the UDP protocol. Some protocols even adjust quality dynamically to maintain continuity during the connection. Thanks to streaming, the receiver (a) can start consuming immediately and (b) can start at any point.

Other benefits of streaming include:

  • More efficient use of server capacity – data traffic is only generated when it is really necessary.
  • Better analysis of viewing behaviour – because the receiver consumes at the moment of downloading, viewing behaviour can be tracked very accurately. The information in these logs is essential for analysing viewing behaviour, trends and technical usage data for programme makers, advertisers, broadcasters, Internet providers, digital video stores and helpdesks.
  • Virtual assembly – using a technique called bursting, fragments from files can be virtually assembled in succession. For end users the advantages are that they do not have to wait (true on demand) and can use the file as if it were stored locally.
  • Better source protection – because no complete files are stored with the recipient, it is harder to make illegal copies.

Downloading is especially efficient if the file is consumed frequently, or with an Internet connection that is too slow to stream.

Network peer to peer

Peer-to-Peer

Peer-to-peer (P2P) makes it possible to receive data that belongs together from different senders. This is possible with both downloading and streaming. The big advantage is that the central server is relieved and data transfer takes faster (or more nearby) routes than those from the server. A disadvantage is that delivery becomes less reliable. Another disadvantage is that distribution is no longer controllable or measurable.
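The basic idea can be sketched with a small simulation in which pieces of one file are collected from several peers; the peers and file content are hypothetical:

```python
# Simulated peer-to-peer fetch: pieces of one file come from different peers.

peers = {  # each peer holds some pieces of the same file, keyed by piece index
    "peer_a": {0: b"Peer-", 2: b"eer s"},
    "peer_b": {1: b"to-p", 3: b"haring"},
}

def assemble(peer_holdings: dict, piece_count: int) -> bytes:
    """Collect each piece from whichever peer has it, then reassemble."""
    pieces = {}
    for holdings in peer_holdings.values():
        for index, data in holdings.items():
            pieces.setdefault(index, data)   # first peer offering a piece wins
    missing = [i for i in range(piece_count) if i not in pieces]
    if missing:
        raise ValueError(f"no peer holds pieces {missing}")
    return b"".join(pieces[i] for i in range(piece_count))

print(assemble(peers, 4))
```

No single peer holds the whole file, yet the client ends up with it: this load distribution is exactly what relieves the central server.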

Future

The general expectation was that by around 2015 virtually all audiovisual content would be distributed via IP: as a download, a VOD stream or an IPTV stream. Analogue distribution over the air and cable would disappear first, and the various DVB variants would migrate to IP. DVB-IP, for example, is an MPEG-4 stream, but carries a DVB name to accelerate adoption in the cable industry. Television distribution with peer-to-peer technology is also seen as a possible contender, although trials show that lesser-known titles are poorly distributed and, in particular, that the quality and reliability of the signals is far below par.

Few Internet service providers support multicasting, so unicasting is used instead. Despite the larger data volumes of unicasting, livestreams – unlike p2p solutions – can be distributed well thanks to decentralised distribution, and even QoS (quality guarantees) for availability and reliability can be designed in.