Updated readme

florian 2023-07-07 22:46:50 +02:00
parent 9a335d1c11
commit 6d71fdc362
2 changed files with 64 additions and 31 deletions

README.md
# Overview
This project allows uploading HLS-based streaming data from SRS (Simple Realtime Server, https://ossrs.io/) to S3-based storage. The purpose is to publish a stream in HLS format to a cloud-based data store in order to leverage CDN distribution.
This project implements a Node.js-based web server that provides a web hook which can be registered as SRS's `on_hls` hook. Whenever a new video segment is created, this web hook is called, and the implementation in this project uploads the `.ts` video segment as well as the `.m3u8` playlist to the storage bucket.
To keep bucket usage limited to a small amount of data, segments older than a certain time window (e.g. 60s) are automatically deleted from the bucket.
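This flow might look roughly like the following minimal sketch (not this project's actual code: it assumes an Express server, `uploadToBucket` is a hypothetical helper, and the payload field names follow SRS's documented `on_hls` JSON body, so verify them against your SRS version):
```
// Hypothetical on_hls webhook handler; field names and helper are assumptions.
import express from "express";
import { readFile } from "node:fs/promises";
import path from "node:path";

// Hypothetical helper: in the real project this would issue a PutObject
// against the configured bucket (see the Configuration section below).
async function uploadToBucket(key: string, body: Buffer): Promise<void> {
}

const app = express();
app.use(express.json());

app.post("/on_hls", async (req, res) => {
  // SRS sends a JSON body describing the new segment, including the local
  // paths of the .ts file and the .m3u8 playlist.
  const { file, m3u8, stream } = req.body;
  await uploadToBucket(`${stream}/${path.basename(file)}`, await readFile(file));
  await uploadToBucket(`${stream}/stream.m3u8`, await readFile(m3u8));
  // Segments older than the retention window (e.g. 60s) would be pruned here.
  res.send("0"); // SRS interprets a "0" response body as success
});

app.listen(8080);
```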
# Configuration
## Cloudflare Setup
This configuration assumes that Cloudflare is used as the storage and CDN provider. In general, any S3-compatible hosting service can be used.
1. First, set up a Cloudflare R2 bucket, e.g. `streams`.
2. Make the bucket publicly accessible by connecting a domain name.
3. Once a domain name is connected (and the proxy setting is enabled in Cloudflare), access is automatically cached by the CDN.
4. Set the CORS settings of the bucket to allow any origin, i.e. `*` (see the example policy below).
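For reference, a CORS policy that allows any origin could look like the following S3-style JSON (a sketch; R2 accepts rules in this shape, but check the current Cloudflare documentation and restrict the methods to what your players need):
```
[
  {
    "AllowedOrigins": ["*"],
    "AllowedMethods": ["GET", "HEAD"],
    "AllowedHeaders": ["*"]
  }
]
```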
## Environment
Create a `.env` file based on `.env.sample` with the S3 credentials:
* Endpoint for S3-compatible storage. Cloudflare uses an endpoint that contains the account ID.
```
S3_ENDPOINT=https://xxxxxxxxxxxxxxxxxxx.r2.cloudflarestorage.com
```
* Credentials for the S3 bucket to store the stream in.
```
S3_ACCESS_KEY_ID=xx
S3_ACCESS_KEY_SECRET=xxx
S3_BUCKET_NAME=streams
```
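These variables could be consumed as in the following sketch, which assumes the AWS SDK v3 (`@aws-sdk/client-s3`) and `dotenv` are used; that is an assumption about this project's dependencies:
```
import "dotenv/config";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

// S3-compatible services are addressed via a custom endpoint;
// Cloudflare R2 expects the region to be "auto".
const s3 = new S3Client({
  region: "auto",
  endpoint: process.env.S3_ENDPOINT,
  credentials: {
    accessKeyId: process.env.S3_ACCESS_KEY_ID!,
    secretAccessKey: process.env.S3_ACCESS_KEY_SECRET!,
  },
});

// Example: upload a playlist to the bucket.
await s3.send(new PutObjectCommand({
  Bucket: process.env.S3_BUCKET_NAME,
  Key: "123456/stream.m3u8",
  Body: "#EXTM3U\n",
  ContentType: "application/vnd.apple.mpegurl",
}));
```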
# Usage
Build the Docker image:
```
docker build -t srs-s3 .
```
With the current setup, the configuration files `conf/mysrs.conf` and `.env` are copied into the Docker image. That is why the image needs to be built from source.
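For orientation, the HLS and hook parts of an SRS config look roughly like this (a sketch, not necessarily the content of `conf/mysrs.conf`; the fragment/window values and the hook URL are assumptions):
```
# Sketch of the relevant vhost section of an SRS config.
vhost __defaultVhost__ {
    hls {
        enabled         on;
        hls_fragment    10;
        hls_window      60;
    }
    http_hooks {
        enabled         on;
        # Assumed port/path of the Node.js hook server inside the container.
        on_hls          http://127.0.0.1:8080/on_hls;
    }
}
```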
Run the Docker image with the RTMP port published:
```
docker run -p 1935:1935 -it --rm srs-s3
```
In a streaming application, use the following settings (assuming the Docker container is running on the local machine):
* Server: `rtmp://localhost`
* Stream Key: `123456` (any text you like)
When you start the stream, you will see the HLS data being uploaded to the S3 storage bucket. The stream is then accessible at `https://your.domain/123456/stream.m3u8`.
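On the playback side, any HLS-capable player can consume that URL; below is a minimal browser sketch using hls.js (the library choice is an assumption; Safari also plays HLS natively):
```
import Hls from "hls.js";

const src = "https://your.domain/123456/stream.m3u8";
const video = document.querySelector("video") as HTMLVideoElement;

if (Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(video);
} else {
  // Safari supports HLS natively via the video element.
  video.src = src;
}
```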
# Known Limitations
* Currently, only streams with a single camera and a single format are supported.
* This upload/sync job needs to run on the same machine as SRS, since the data is read from the local hard disk. This is why it currently runs in the same Docker container.