A service that hooks into the SRS streaming server and uploads its HLS feed to an S3 bucket. For convenience, it is set up to be bundled with the SRS Docker image for a self-contained solution.

# Overview

This project uploads HLS streaming data from SRS (Simple Realtime Server, https://ossrs.io/) to S3-based storage. The purpose is to publish a stream in HLS format to a cloud-based data store and leverage CDN distribution from there.

This project implements a Node.js based web server exposing an endpoint that can be registered as SRS's `on_hls` web hook. Whenever a new video segment is created, SRS calls this hook, and the handler uploads the `.ts` video segment as well as the `.m3u8` playlist to the storage bucket.
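
As a rough sketch of what such a handler involves (not this repository's actual code: it assumes Express and the AWS SDK v3, the route and environment variable names are made up, and the `file`/`m3u8` payload fields follow SRS's documented `on_hls` callback):

```
// Hypothetical sketch of an on_hls handler, not this repository's implementation.
const express = require("express");
const fs = require("fs");
const path = require("path");
const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");

const app = express();
app.use(express.json());
const s3 = new S3Client({ region: process.env.S3_REGION });

app.post("/api/v1/hls", async (req, res) => {
  try {
    // SRS reports the local paths of the new segment and its playlist.
    const { file, m3u8 } = req.body;
    for (const local of [file, m3u8]) {
      await s3.send(new PutObjectCommand({
        Bucket: process.env.S3_BUCKET,
        Key: path.basename(local),
        Body: await fs.promises.readFile(local),
        ContentType: local.endsWith(".m3u8")
          ? "application/vnd.apple.mpegurl"
          : "video/mp2t",
      }));
    }
    res.json({ code: 0 }); // SRS treats code 0 as success
  } catch (err) {
    res.json({ code: 1 }); // non-zero signals failure to SRS
  }
});

app.listen(8080);
```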

To keep the amount of data in the bucket small, segments older than a certain time frame (e.g. 60s) are automatically deleted from the bucket.
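
A pruning pass could look roughly like this (again a sketch under assumptions: the flat key layout, the 60-second window, and the AWS SDK v3 list/delete calls are illustrative, not necessarily what this project does):

```
// Hypothetical pruning sketch: remove .ts segments older than the window.
const {
  S3Client,
  ListObjectsV2Command,
  DeleteObjectsCommand,
} = require("@aws-sdk/client-s3");

const RETENTION_MS = 60 * 1000; // e.g. keep ~60s of segments

async function pruneOldSegments(s3, bucket) {
  const { Contents = [] } = await s3.send(
    new ListObjectsV2Command({ Bucket: bucket })
  );
  const cutoff = Date.now() - RETENTION_MS;
  // Only segments are aged out; the .m3u8 playlist is re-uploaded on
  // every new segment and must stay in place.
  const stale = Contents.filter(
    (o) => o.Key.endsWith(".ts") && o.LastModified.getTime() < cutoff
  );
  if (stale.length === 0) return;
  await s3.send(
    new DeleteObjectsCommand({
      Bucket: bucket,
      Delete: { Objects: stale.map(({ Key }) => ({ Key })) },
    })
  );
}
```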

# Configuration

Create a `.env` file based on `.env.sample` with the S3 credentials.

To publish, point your encoder at `rtmp://localhost/live`; any stream key works.
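
For orientation, the `.env` might look roughly like the following. The variable names are illustrative placeholders, not verified against this project; `.env.sample` is the authoritative reference:

```
S3_REGION=eu-central-1
S3_BUCKET=my-hls-bucket
S3_ACCESS_KEY_ID=<your access key>
S3_SECRET_ACCESS_KEY=<your secret key>
```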

# Usage

```
docker build -t srs-s3 .
docker run -p 1935:1935 -it --rm srs-s3
```
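
Once the container is running, a test stream can be published with any RTMP-capable encoder, e.g. FFmpeg (the input file and stream key here are arbitrary):

```
ffmpeg -re -i input.mp4 -c copy -f flv rtmp://localhost/live/mystream
```

`-re` reads the input at its native frame rate so SRS receives it as a live feed.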

# Known Limitations

* Currently, only a single stream (one camera, one format) is supported.
* The upload/sync job must run on the same machine as SRS, since segment data is read from the local disk. This is why it currently runs in the same Docker container.