\input texinfo @c -*- texinfo -*-
@settitle avserver Documentation
@titlepage
@center @titlefont{avserver Documentation}
@end titlepage
@top
@contents
@chapter Synopsis
The generic syntax is:
@example
@c man begin SYNOPSIS
avserver [options]
@c man end
@end example
@chapter Description
@c man begin DESCRIPTION
WARNING: avserver is unmaintained, largely broken and in need of a
complete rewrite. It probably won't work for you. Use at your own
risk.
avserver is a streaming server for both audio and video. It supports
several live feeds, streaming from files and time shifting on live feeds
(you can seek to positions in the past on each live feed, provided you
specify a big enough feed storage in avserver.conf).
This documentation covers only the streaming aspects of avserver /
avconv. Questions about avconv parameters, codec choices and the like
are not covered here. Read @file{avconv.html} for more information.
@section How does it work?
avserver receives prerecorded files or FFM streams from some avconv
instance as input, then streams them over RTP/RTSP/HTTP.
An avserver instance will listen on some port as specified in the
configuration file. You can launch one or more instances of avconv and
send one or more FFM streams to the port where avserver is expecting
to receive them. Alternatively, you can make avserver launch such avconv
instances at startup.
Input streams are called feeds, and each one is specified by a <Feed>
section in the configuration file.
For each feed you can have different output streams in various
formats, each one specified by a <Stream> section in the configuration
file.
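As a rough sketch (the directive names and values here are illustrative;
check the @file{doc/avserver.conf} shipped with the sources for the
authoritative list of directives and their defaults), a port, a feed and a
stream could be declared like this:
@example
# Port on which avserver listens for incoming HTTP connections
Port 8090
BindAddress 0.0.0.0

# A feed, filled by an avconv instance pushing to /feed1.ffm
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 200K
</Feed>

# An output stream generated from that feed
<Stream test.asf>
Feed feed1.ffm
Format asf
</Stream>
@end example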
@section Status stream
avserver supports an HTTP interface which exposes the current status
of the server.
Simply point your browser to the address of the special status stream
specified in the configuration file.
For example if you have:
@example
<Stream status.html>
Format status
# Only allow local people to get the status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
</Stream>
@end example
then the server will post a page with the status information when
the special stream @file{status.html} is requested.
@section What can this do?
When properly configured and running, you can capture video and audio in real
time from a suitable capture card, and stream it out over the Internet to
either Windows Media Player or RealAudio player (with some restrictions).
It can also stream from files, though that is currently broken. Very often, a
web server can be used to serve up the files just as well.
It can stream prerecorded video from .ffm files, though it is somewhat tricky
to make it work correctly.
@section What do I need?
I use Linux on a 900 MHz Duron with a cheapo Bt848 based TV capture card. I'm
using stock Linux 2.4.17 with the stock drivers. [Actually that isn't true,
I needed some special drivers for my motherboard-based sound card.]
I understand that FreeBSD systems work just fine as well.
@section How do I make it work?
First, build the kit. It *really* helps to have installed LAME first. Then,
when you run ./configure for avserver, make sure that the
@code{--enable-libmp3lame} flag is turned on.
LAME is important as it allows for streaming audio to Windows Media Player.
Don't ask why the other audio types do not work.
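For instance, a minimal build with MP3 support might look like this (any
other configure options your setup needs are omitted):
@example
./configure --enable-libmp3lame
make
@end example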
As a simple test, just run the following two command lines where INPUTFILE
is some file which you can decode with avconv:
@example
./avserver -f doc/avserver.conf &
./avconv -i INPUTFILE http://localhost:8090/feed1.ffm
@end example
At this point you should be able to go to your Windows machine and fire up
Windows Media Player (WMP). Go to Open URL and enter
@example
http://<linuxbox>:8090/test.asf
@end example
You should (after a short delay) see video and hear audio.
WARNING: trying to stream test1.mpg doesn't work with WMP as it tries to
transfer the entire file before starting to play.
The same is true of AVI files.
@section What happens next?
You should edit the avserver.conf file to suit your needs (in terms of
frame rates etc). Then install avserver and avconv, write a script to start
them up, and off you go.
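Such a script can be as simple as the sketch below; the configuration file
location and the input are placeholders to adapt to your setup:
@example
#!/bin/sh
# Start the server with your edited configuration file
avserver -f /etc/avserver.conf &

# Give it a moment to start listening, then start feeding it
sleep 2
avconv -i INPUTFILE http://localhost:8090/feed1.ffm &
@end example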
@section Troubleshooting
@subsection I don't hear any audio, but video is fine.
Maybe you didn't install LAME, or got your ./configure statement wrong. Check
the avconv output to see if a line referring to MP3 is present. If not, then
your configuration was incorrect. If it is, then maybe your wiring is not
set up correctly. Maybe the sound card is not getting data from the right
input source. Maybe you have a really awful audio interface (like I do)
that only captures in stereo and also requires that one channel be flipped.
If you are one of these people, then export 'AUDIO_FLIP_LEFT=1' before
starting avconv.
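For example, in the shell used to start avconv:
@example
# Set the flag in the environment, then run your usual avconv command
export AUDIO_FLIP_LEFT=1
./avconv -i INPUTFILE http://localhost:8090/feed1.ffm
@end example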
@subsection The audio and video lose sync after a while.
Yes, they do.
@subsection After a long while, the video update rate goes way down in WMP.
Yes, it does. Who knows why?
@subsection WMP 6.4 behaves differently to WMP 7.
Yes, it does. Any thoughts on this would be gratefully received. These
differences extend to embedding WMP into a web page. [There are two
object IDs that you can use: The old one, which does not play well, and
the new one, which does (both tested on the same system). However,
I suspect that the new one is not available unless you have installed WMP 7].
@section What else can it do?
You can replay video from .ffm files that was recorded earlier.
However, there are a number of caveats, including the fact that the
avserver parameters must match the original parameters used to record the
file. If they do not, then avserver deletes the file before recording into it.
(Now that I write this, it seems broken).
You can fiddle with many of the codec choices and encoding parameters, and
there are a bunch more parameters that you cannot control. Post a message
to the mailing list if there are some 'must have' parameters. Look in
avserver.conf for a list of the currently available controls.
It will automatically generate the ASX or RAM files that are often used
in browsers. These files are actually redirections to the underlying ASF
or RM file. The reason for this is that the browser often fetches the
entire file before starting up the external viewer. The redirection files
are very small and can be transferred quickly. [The stream itself is
often 'infinite' and thus the browser tries to download it and never
finishes.]
@section Tips
* When you connect to a live stream, most players (WMP, RA, etc) want to
buffer a certain number of seconds of material so that they can display the
signal continuously. However, avserver (by default) starts sending data
in real time. This means that there is a pause of a few seconds while the
buffering is being done by the player. The good news is that this can be
cured by adding a '?buffer=5' to the end of the URL. This means that the
stream should start 5 seconds in the past -- and so the first 5 seconds
of the stream are sent as fast as the network will allow. It will then
slow down to real time. This noticeably improves the startup experience.
You can also add a 'Preroll 15' statement to avserver.conf to apply a
15-second prebuffer to all requests that do not otherwise specify a
time (see the example after this list). In addition, avserver will skip
frames until a key frame is found, which further reduces the startup delay
by not transferring data that would be discarded.
* You may want to adjust the MaxBandwidth in the avserver.conf to limit
the amount of bandwidth consumed by live streams.
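As an illustration of both tips (the stream name, preroll length and
bandwidth value are placeholders; see @file{doc/avserver.conf} for the exact
directive semantics):
@example
# Client side: ask for 5 seconds of prebuffered data
http://<linuxbox>:8090/test.asf?buffer=5

# Server side, in avserver.conf: default preroll for a stream
<Stream test.asf>
Feed feed1.ffm
Format asf
Preroll 15
</Stream>

# Global limit, in kbit/s, on the bandwidth consumed by live streams
MaxBandwidth 1000
@end example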
@section Why does the ?buffer / Preroll stop working after a time?
It turns out that (on my machine at least) the number of frames successfully
grabbed is marginally less than the number that ought to be grabbed. As a
result, the timestamps in the encoded data stream fall behind real time.
Consequently, if you say 'Preroll 10', then once the stream gets 10
or more seconds behind, there is no Preroll left.
Fixing this requires a change in the internals of how timestamps are
handled.
@section Does the @code{?date=} stuff work?
Yes (subject to the limitation outlined above). Also note that whenever you
start avserver, it deletes the ffm file (if any parameters have changed),
thus wiping out what you had recorded before.
The format of the @code{?date=xxxxxx} is fairly flexible. You should use one
of the following formats (the 'T' is literal):
@example
* YYYY-MM-DDTHH:MM:SS (localtime)
* YYYY-MM-DDTHH:MM:SSZ (UTC)
@end example
You can omit the YYYY-MM-DD, and then it refers to the current day. However
note that @samp{?date=16:00:00} refers to 16:00 on the current day -- this
may be in the future and so is unlikely to be useful.
You use this by adding the ?date= to the end of the URL for the stream.
For example: @samp{http://localhost:8080/test.asf?date=2002-07-26T23:05:00}.
@c man end
@chapter Options
@c man begin OPTIONS
@include avtools-common-opts.texi
@section Main options
@table @option
@item -f @var{configfile}
Use @file{configfile} instead of @file{/etc/avserver.conf}.
@item -n
Enable no-launch mode. This option disables all the Launch directives
within the various <Stream> sections. Since avserver will not launch
any avconv instances, you will have to launch them manually.
@item -d
Enable debug mode. This option increases log verbosity and directs log
messages to stdout.
@end table
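For example, to run avserver with verbose logging on stdout, a custom
configuration file, and without launching any avconv instances:
@example
avserver -d -n -f /path/to/avserver.conf
@end example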
@c man end
@ignore
@setfilename avserver
@settitle avserver video server
@c man begin SEEALSO
avconv(1), avplay(1), avprobe(1), the @file{avserver.conf}
example and the Libav HTML documentation
@c man end
@c man begin AUTHORS
The Libav developers
@c man end
@end ignore
@bye