
Fixes for the internal release (#36)
* update readme and examples

* bump webrtc plugin

* prevent CLI from ignoring duplicated switches

* example html fix

* README and examples improvements

* temporarily remove RTSP example

* update deps
mat-hek authored Sep 11, 2024
1 parent 7a65ea0 commit f17e24f
Showing 19 changed files with 476 additions and 312 deletions.
92 changes: 76 additions & 16 deletions README.md
@@ -4,34 +4,94 @@
[![API Docs](https://img.shields.io/badge/api-docs-yellow.svg?style=flat)](https://hexdocs.pm/boombox)
[![CircleCI](https://circleci.com/gh/membraneframework/boombox.svg?style=svg)](https://circleci.com/gh/membraneframework/boombox)

Boombox is a powerful tool for audio & video streaming based on the [Membrane Framework](https://membrane.stream).
Boombox is a high-level tool for audio & video streaming based on the [Membrane Framework](https://membrane.stream).

## Installation

The package can be installed by adding `boombox` to your list of dependencies in `mix.exs`:
The code below receives a stream via RTMP and sends it over HLS:

```elixir
def deps do
  [
    {:boombox, "~> 0.1.0"}
  ]
end
Boombox.run(input: "rtmp://localhost:5432", output: "index.m3u8")
```

## Usage
You can use the CLI interface too:

```
boombox -i "rtmp://localhost:5432" -o "index.m3u8"
```

See `examples.livemd` for usage examples.
And the code below generates a video with a bouncing Membrane logo and sends it over WebRTC:

## CLI app
```elixir
Mix.install([{:boombox, github: "membraneframework-labs/boombox"}, :req, :image])

overlay =
  Req.get!("https://avatars.githubusercontent.com/u/25247695?s=200&v=4").body
  |> Vix.Vips.Image.new_from_buffer()
  |> then(fn {:ok, img} -> img end)
  |> Image.trim!()
  |> Image.thumbnail!(100)

bg = Image.new!(640, 480, color: :light_gray)
max_x = Image.width(bg) - Image.width(overlay)
max_y = Image.height(bg) - Image.height(overlay)

Stream.iterate({_x = 300, _y = 0, _dx = 1, _dy = 2, _pts = 0}, fn {x, y, dx, dy, pts} ->
  dx = if (x + dx) in 0..max_x, do: dx, else: -dx
  dy = if (y + dy) in 0..max_y, do: dy, else: -dy
  pts = pts + div(Membrane.Time.seconds(1), _fps = 60)
  {x + dx, y + dy, dx, dy, pts}
end)
|> Stream.map(fn {x, y, _dx, _dy, pts} ->
  img = Image.compose!(bg, overlay, x: x, y: y)
  %Boombox.Packet{kind: :video, payload: img, pts: pts}
end)
|> Boombox.run(
  input: {:stream, video: :image, audio: false},
  output: {:webrtc, "ws://localhost:8830"}
)
```

To build a CLI app, clone the repo and run
To receive WebRTC/HLS from Boombox in a browser, or to send WebRTC from a browser to Boombox,
you can use the simple HTML examples in the `boombox_examples_data` folder, for example:

```
mix deps.get
./build_binary.sh
wget https://raw.githubusercontent.com/membraneframework-labs/boombox/dev/boombox_examples_data/webrtc_to_browser.html
open webrtc_to_browser.html
```

It should generate a single executable called `boombox`, which you can run.
For more examples, see `examples.livemd`.

### Supported formats

format | direction
---|---
MP4 | input, output
WebRTC | input, output
RTMP | input
HLS | output
Elixir Stream | input, output
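
For instance, reading an MP4 file and streaming it over WebRTC follows the same `Boombox.run` pattern as the examples above; a minimal sketch, assuming file-based formats are passed as plain paths like the HLS playlist earlier (the file name is just a placeholder):

```elixir
Boombox.run(input: "example.mp4", output: {:webrtc, "ws://localhost:8830"})
```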

## Installation

To use Boombox as an Elixir library, add

```elixir
{:boombox, github: "membraneframework-labs/boombox"}
```

to your dependencies or `Mix.install`.
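
For a quick script or a Livebook, the same dependency can be pulled in with `Mix.install`, for example:

```elixir
Mix.install([{:boombox, github: "membraneframework-labs/boombox"}])
```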

To use Boombox via the CLI, run the following:

```
wget https://raw.githubusercontent.com/membraneframework-labs/boombox/dev/bin/boombox
chmod u+x boombox
./boombox
```

Make sure you have [Elixir](https://elixir-lang.org/) installed. The first call to `boombox` will install Boombox itself in a default directory in the system. The directory can be set with the `MIX_INSTALL_DIR` env variable if preferred.
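
For example, to keep the installed artifacts in a custom location, prefix the call with the variable (the directory below is only an example path):

```
MIX_INSTALL_DIR=./boombox_install ./boombox -i "rtmp://localhost:5432" -o "index.m3u8"
```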

## CLI API

The CLI API is similar to the Elixir API, for example:

@@ -57,7 +117,7 @@ The first run of the CLI may take longer than usual, as the necessary artifacts

## Copyright and License

Copyright 2020, [Software Mansion](https://swmansion.com/?utm_source=git&utm_medium=readme&utm_campaign=boombox)
Copyright 2024, [Software Mansion](https://swmansion.com/?utm_source=git&utm_medium=readme&utm_campaign=boombox)

[![Software Mansion](https://logo.swmansion.com/logo?color=white&variant=desktop&width=200&tag=membrane-github)](https://swmansion.com/?utm_source=git&utm_medium=readme&utm_campaign=boombox)

2 changes: 2 additions & 0 deletions bin/boombox
@@ -0,0 +1,2 @@
#!/bin/sh
elixir -e 'Logger.configure(level: :info);Mix.install([{:boombox, github: "membraneframework-labs/boombox"}]);Boombox.run_cli()' "$@"
26 changes: 26 additions & 0 deletions boombox_examples_data/hls.html
@@ -0,0 +1,26 @@
<!DOCTYPE html>
<html lang="en" style="font-family: arial; color: white; background-color: black; margin: 0;">

<body>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
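// hls.js loads the playlist given in the Source field and attaches playback to the <video> element below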
function play() {
var video = document.getElementById('player');
var hls = new Hls();
hls.loadSource(document.querySelector("#source").value);
hls.attachMedia(video);
}
</script>
<h1>Boombox HLS Example</h1>
Source:<br>
<input type="text" id="source" value="output/index.m3u8" style="width:400px">&ensp;
<button id="play" onclick="play()">Play</button><br><br>
<video id="player" autoplay muted controls></video>
<script>
const source = document.querySelector("#source");
source.value = new URLSearchParams(window.location.search).get("src") || "output/index.m3u8";
play();
</script>
</body>

</html>
97 changes: 97 additions & 0 deletions boombox_examples_data/webrtc_from_browser.html
@@ -0,0 +1,97 @@
<!DOCTYPE html>
<html lang="en">

<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta http-equiv="X-UA-Compatible" content="ie=edge">
<title>Boombox stream WebRTC from browser example</title>
</head>

<body
style="background-color: black; color: white; font-family: Arial, Helvetica, sans-serif; min-height: 100vh; margin: 0px; padding: 5px 0px 5px 0px">
<main>
<h1>Boombox stream WebRTC from browser example</h1>
<div>
Boombox URL: <input type="text" value="ws://localhost:8829" id="url" /> <button id="button">Connect</button>
</div>
<div id="status"></div>
<br>
<video id="preview" autoplay muted></video>
</main>

<script>
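// Signaling: the page exchanges JSON messages of the form { type, data } with Boombox over a WebSocket -
// the browser sends an SDP offer and ICE candidates, and applies the SDP answer and candidates it receives.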
const pcConfig = { 'iceServers': [{ 'urls': 'stun:stun.l.google.com:19302' },] };
const mediaConstraints = { video: { width: 640, height: 480 }, audio: true };
const button = document.getElementById("button");
const connStatus = document.getElementById("status");
const preview = document.getElementById("preview");
const url = document.getElementById("url");

const connectRTC = async (ws) => {
const localStream = await navigator.mediaDevices.getUserMedia(mediaConstraints);
preview.srcObject = localStream;
const pc = new RTCPeerConnection(pcConfig);

pc.onicecandidate = event => {
if (event.candidate === null) return;
console.log("Sent ICE candidate:", event.candidate);
ws.send(JSON.stringify({ type: "ice_candidate", data: event.candidate }));
};

pc.onconnectionstatechange = () => {
if (pc.connectionState == "connected") {
button.innerHTML = "Disconnect";
button.onclick = () => {
ws.close();
localStream.getTracks().forEach(track => track.stop())
button.onclick = connect;
button.innerHTML = "Connect";
}
connStatus.innerHTML = "Connected ";
}
}

for (const track of localStream.getTracks()) {
pc.addTrack(track, localStream);
}

ws.onmessage = async event => {
const { type, data } = JSON.parse(event.data);

switch (type) {
case "sdp_answer":
console.log("Received SDP answer:", data);
await pc.setRemoteDescription(data);
break;
case "ice_candidate":
console.log("Received ICE candidate:", data);
await pc.addIceCandidate(data);
break;
}
};

const offer = await pc.createOffer();
await pc.setLocalDescription(offer);
console.log("Sent SDP offer:", offer)
ws.send(JSON.stringify({ type: "sdp_offer", data: offer }));
};

const connect = () => {
connStatus.innerHTML = "Connecting..."
const ws = new WebSocket(url.value);
ws.onopen = _ => connectRTC(ws);
ws.onclose = event => {
connStatus.innerHTML = "Disconnected"
button.onclick = connect;
button.innerHTML = "Connect";
console.log("WebSocket connection was terminated:", event);
}
}

button.onclick = connect;
</script>

</body>

</html>
70 changes: 70 additions & 0 deletions boombox_examples_data/webrtc_to_browser.html
@@ -0,0 +1,70 @@
<!DOCTYPE html>
<html lang="en">

<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta http-equiv="X-UA-Compatible" content="ie=edge">
<title>Boombox stream WebRTC to browser example</title>
</head>

<body
style="background-color: black; color: white; font-family: Arial, Helvetica, sans-serif; min-height: 100vh; margin: 0px; padding: 5px 0px 5px 0px">
<main>
<h1>Boombox stream WebRTC to browser example</h1>
<div>
Boombox URL: <input type="text" value="ws://localhost:8830" id="url" /> <button id="button">Connect</button>
</div>
<br>
<video id="videoPlayer" controls muted autoplay></video>
</main>
<script>
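// Signaling: the page exchanges JSON messages of the form { type, data } with Boombox over a WebSocket -
// Boombox sends an SDP offer, the browser replies with an SDP answer, and both sides exchange ICE candidates.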
const pcConfig = { 'iceServers': [{ 'urls': 'stun:stun.l.google.com:19302' },] };
const button = document.getElementById("button");
const connStatus = document.getElementById("status");
const url = document.getElementById("url");
const videoPlayer = document.getElementById("videoPlayer");

const connectRTC = async (ws) => {
videoPlayer.srcObject = new MediaStream();

const pc = new RTCPeerConnection(pcConfig);
pc.ontrack = event => videoPlayer.srcObject.addTrack(event.track);
videoPlayer.play();
pc.onicecandidate = event => {
if (event.candidate === null) return;

console.log("Sent ICE candidate:", event.candidate);
ws.send(JSON.stringify({ type: "ice_candidate", data: event.candidate }));
};

ws.onmessage = async event => {
const { type, data } = JSON.parse(event.data);

switch (type) {
case "sdp_offer":
console.log("Received SDP offer:", data);
await pc.setRemoteDescription(data);
const answer = await pc.createAnswer();
await pc.setLocalDescription(answer);
ws.send(JSON.stringify({ type: "sdp_answer", data: answer }));
console.log("Sent SDP answer:", answer)
break;
case "ice_candidate":
console.log("Received ICE candidate:", data);
await pc.addIceCandidate(data);
}
};
};

const connect = () => {
const ws = new WebSocket(url.value);
ws.onopen = () => connectRTC(ws);
ws.onclose = event => console.log("WebSocket connection was terminated:", event);
}

button.onclick = connect;
</script>
</body>

</html>