diff --git a/videoroom/1_Introduction.md b/videoroom/1_Introduction.md
deleted file mode 100644
index d6c6d32..0000000
--- a/videoroom/1_Introduction.md
+++ /dev/null
@@ -1,48 +0,0 @@
-# Introduction
-
-The scope of this tutorial covers the process of creating your own video room with the use of the Membrane Framework.
-
-## What are we doing here?
-
-Not so long ago, video rooms became quite a common tool in many fields of our lives. We use them when we want to feel as if our loved ones were sitting right next to us. We use them at work, to synchronize our progress and exchange information with our colleagues.
-Taking advantage of recent technological improvements and state-of-the-art tools introduced in the field of data transmission, video streaming
-has become accessible to everyone on a scale not known previously.
-But have you ever wondered how a video room works under the hood? How is it possible for tools such as WebEx or Google Meet to stream data between so many peers participating in the same video conversation?
-Or maybe you have thought of creating your own video room but didn't know where to start?
-If so, I am pleased to invite you on an exciting journey to the land of multimedia streaming - just follow this tutorial.
-At the end of the tutorial, you will have a fully functional video room implemented.
-You will learn about many interesting aspects of media streaming, as well as get familiar with the tools which make media streaming easier.
-
-## Expected result
-
-Here you can see how our application should work. We want to be able to open the web application, enter the name of the room we want to join along with our own name, and then join that room. In the room, we will see and hear all of the other users who have joined it.
-![Expected Result](assets/records/expected_result.webp)
-
-## Prerequisites
-
-Since media streaming is quite a complex topic, it would be good for you to know something about how the browser can fetch the user's media, how a connection is made between peers, etc. Since we will be using the Phoenix framework to create our application, it will be much easier for you to understand what's going on if you are at least slightly familiar with that framework. Take your time and glance over these links:
-
-- [How does Phoenix work?](https://hexdocs.pm/phoenix/request_lifecycle.html)
-  Phoenix, while being a great tool that allows creating a complex application in a considerably easy manner, requires its user to follow a bunch of good practices and use some helpful project patterns. The most important one is the MVC (Model-View-Controller) pattern, which affects the structure of the project directories. The tutorial linked there provides a great introduction to creating a Phoenix application and will allow you to understand the structure of our template project.
-
-- [How do Phoenix sockets work, and what is the difference between an endpoint and a socket/channel?](https://hexdocs.pm/phoenix/channels.html)
-  When we think about building a web application, the very first thing that comes to mind is [HTTP](../glossary/glossary.md#http).
-  Surely, Phoenix allows us to send HTTP requests from the client application to the server - however, there is an alternative way to communicate,
-  which can also be used in a Phoenix application - [sockets](https://datatracker.ietf.org/doc/html/rfc6455).
-  Sockets, in contrast to plain HTTP requests, are persistent and allow bidirectional communication, while HTTP requests are stateless and work in request -> reply mode.
-  Want to dig deeper? Feel free to read the linked part of the official Phoenix documentation!
-
-- [How to access the user's media from the browser?](https://www.html5rocks.com/en/tutorials/webrtc/basics/)
-  Ever wondered how it is possible for the browser to access your camera or microphone? Here you will find an answer to that and many more inquiring questions!
-
-- [WebRTC connectivity (signaling, ICE etc.)](https://developer.mozilla.org/en-US/docs/Web/API/WebRTC_API/Connectivity)
-  One does not simply connect and send media! First, peers need to get in touch with each other (with a little help from a publicly available server),
-  as well as exchange some information about themselves. This short tutorial will give you an outlook on how this process (called '[signaling](../glossary/glossary.md#signaling)') can be performed!
-
-- [Why do we need STUN/TURN servers?](https://www.html5rocks.com/en/tutorials/webrtc/infrastructure/)
-  A peer-to-peer connection can be (and in most cases is) problematic. At the same time, it is also desirable - we don't want our media to pass through some server
-  (both due to throughput limitations and privacy issues). While reading this tutorial you will find some tricks which allow you to connect to your beloved peer hidden
-  behind firewalls and [NAT](../glossary/glossary.md#nat)!
-
-- [WebRTC architectures](https://medium.com/securemeeting/webrtc-architecture-basics-p2p-sfu-mcu-and-hybrid-approaches-6e7d77a46a66)
-  Take a quick glance there and find out what the possible architectures of WebRTC servers are, when to use which architecture, and how to build a streaming solution that scales and behaves well.
diff --git a/videoroom/2_EnvironmentPreparation.md b/videoroom/2_EnvironmentPreparation.md
deleted file mode 100644
index e4ffa40..0000000
--- a/videoroom/2_EnvironmentPreparation.md
+++ /dev/null
@@ -1,145 +0,0 @@
-# Environment preparation
-
-## Elixir installation
-
-I don't think I can describe it any better: [How to install Elixir](https://elixir-lang.org/install.html).
-But do not forget to add the Elixir bin directory to your PATH variable!
-
-Take your time and make yourself comfortable with Elixir. Check if you can run Elixir's interactive shell and compile Elixir source files with the Elixir compiler.
-You can also try to create a new Mix project - we will be using [Mix](https://elixir-lang.org/getting-started/mix-otp/introduction-to-mix.html) as the build automation tool throughout the tutorial.
-
-## Template downloading
-
-Once we have the development environment set up properly (let's hope so!), we can start to work on our project. We don't want you to do it from scratch, as the development requires some dull playing around with UI, setting up the dependencies, etc. - we want to provide you only the meat! That is why we would like you to download the template project with the core parts of the code missing. You can do it by typing:
-
-```bash
-git clone https://github.com/membraneframework/membrane_videoroom_tutorial
-```
-
-and then changing directory to the freshly cloned repository and switching to the branch which provides the unfulfilled template:
-
-```bash
-cd membrane_videoroom_tutorial
-git checkout template/start
-```
-
-In case you find yourself lost along the way, feel free to check the suggested implementation provided by us, which is available on the `template/end` branch of this repository.
-
-## Installing native dependencies
-
-You will need `npm` version 9.0.0 or higher, which you can install with [Node.js](https://nodejs.org/).
-
-Apart from that, some native dependencies are needed. Here is how you can install them and set up the required environment variables.
-
-### Mac OS with M1
-
-```bash
-brew install node srtp libnice clang-format ffmpeg
-export C_INCLUDE_PATH=/opt/homebrew/Cellar/libnice/0.1.18/include:/opt/homebrew/Cellar/opus/1.3.1/include:/opt/homebrew/Cellar/openssl@1.1/1.1.1l_1/include
-export LIBRARY_PATH=/opt/homebrew/Cellar/opus/1.3.1/lib
-export PKG_CONFIG_PATH=/opt/homebrew/Cellar/openssl@1.1/1.1.1l_1/lib/pkgconfig/
-```
-
-### Mac OS with Intel
-
-```bash
-brew install node srtp libnice clang-format ffmpeg
-export PKG_CONFIG_PATH="/usr/local/opt/openssl@1.1/lib/pkgconfig"
-```
-
-### Ubuntu
-
-```bash
-sudo apt-get install npm build-essential pkg-config libssl-dev libopus-dev libsrtp2-dev libnice-dev libavcodec-dev libavformat-dev libavutil-dev
-export PKG_CONFIG_PATH="/usr/local/ssl/lib/pkgconfig"
-```
-
-If you installed Elixir from the ESL repository, make sure the following Erlang packages are present:
-
-```bash
-sudo apt-get install erlang-dev erlang-parsetools erlang-src
-```
-
-### Setting up the environment with the use of Docker
-
-As an alternative to the steps described in the section above, you can make use of the Docker image we have prepared for the purpose of this tutorial.
-You won't need to install any native dependencies there, nor export environment variables - however, **your computer cannot be running on an M1 processor**.
-
-If you are using VS Code for your development, you can make use of the [Remote - Containers](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers) extension. Among the files you have just cloned from the repository there should be a `.devcontainer.json` configuration file placed in the root directory of the project. It contains information about which image from Docker Hub should be pulled and how the ports should be redirected between the container and the host.
-After installing the Remote - Containers extension you will be able to start the container by clicking on the green button in the bottom-left corner of the VS Code window and then selecting the "Reopen in Container" option.
-This will cause all the files in the project's root directory to be shared between your host OS and the container - therefore any changes made to them on your local machine will be immediately reflected in the container.
-At the same time, you will be able to run the project from inside the container - with the terminal launched in the VS Code window (`Terminal -> New Terminal`).
-
-If you are not using VS Code, you can still take advantage of the virtualization and use the image provided by us - however, you will need to create the shared filesystem volume and bridge the networks on your own. Here is the command which will do this for you:
-
-```bash
-docker run -e RUNNING_IN_DOCKER=1 -p 4000:4000 -p 50000-50019:50000-50019/udp -it -v <path_to_repo>:/videoroom membraneframeworklabs/docker_membrane
-```
-
-where `<path_to_repo>` is the **absolute** path to the root directory of the project on your local system.
-
-If you have just cloned the repo and your current directory is the repo's root, you can use `pwd` to get that path:
-
-```bash
-docker run -e RUNNING_IN_DOCKER=1 -p 4000:4000 -p 50000-50019:50000-50019/udp -it -v `pwd`:/videoroom membraneframeworklabs/docker_membrane
-```
-
-After running the command, a container terminal will be attached to your terminal. You will be able to find the project code inside the container in the `/videoroom` directory.
-
-## What do we have here?
-
-Let's do some reconnaissance.
-First, let's run the template.
-Before running the template we need to install the dependencies using:
-
-```bash
-mix deps.get
-npm ci --prefix=assets
-```
-
-In case you want your videoroom to be accessible from outside of localhost, or you are developing from inside a Docker container, you need to find your machine's IP address in the network. We will refer to that IP address as `EXTERNAL_IP`. The way to do that is described [here](https://github.com/membraneframework/membrane_videoroom#launching-of-the-application-1).
-
-Then you can simply run the [Phoenix](../glossary/glossary.md#phoenix) server with the following command:
-
-```bash
-EXTERNAL_IP=<EXTERNAL_IP> mix phx.server
-```
-
-If everything went well, the application should be available at [http://localhost:4000](http://localhost:4000/).
-
-Play around... but there is not that much to do yet! We had better inspect the structure of our project.
-Does the project structure resemble the structure of a Phoenix project? (In fact, it should!) We will go through the directories in our project.
-
-- **assets/**
-  Here you can find the front end of our application. The most interesting subdirectory is `src/` - we will be putting our TypeScript files there. For now, the following files should be present there:
-
-  - **consts.ts** - as the name suggests, you will find some constant values there - media constraints and our local peer id
-  - **index.ts** - this one should be empty. It will act as an initialization point for our application and, later on, we will spawn a room object there.
-  - **room_ui.ts** - methods which modify the DOM are put there. You will find these methods helpful while implementing your room's logic - you will be able to simply call a method in order to put the next video tile among the previously present video tiles, and this whole process (along with rescaling or moving the tiles so they are nicely laid out on the screen) will be performed automatically
-
-- **config/**
-  Here you can find configuration files for the given environments. There is nothing we should be interested in here.
-
-- **deps/**
-  Once you type `mix deps.get`, all the dependencies listed in the mix.lock file will get downloaded and put into this directory. Once again - this is just how Mix works, and we do not care about this directory at all.
-
-- **lib/**
-  This directory contains the server's logic. As mentioned previously, the Phoenix server implements the Model-View-Controller architecture, so the structure of this directory reflects it.
-  The only .ex file in this directory is `videoroom_web.ex` - it defines the aforementioned parts of the system - the **controller** and the **view**. Moreover,
-  it defines the `router` and the `channel` - the parts of the system which are used for communication. This file is generated automatically by the Phoenix project generator,
-  and there are not that many situations in which you should change it manually.
-
-  - **videoroom/**
-    This directory contains the business logic of our application, which stands for M (model) in the MVC architecture. For now, it should only contain the application.ex file, which defines the Application module of our video room. As each [application](https://hexdocs.pm/elixir/1.12/Application.html), it can be loaded, started, and stopped, and it can bring to life its own children (which constitute the environment created by the application). Later on, we will put into this directory files which will provide the logic of our application - for instance, the Videoroom.Room module will be defined there.
-  - **videoroom_web/**
-    This directory contains files that stand for V (view) and C (controller) in the MVC architecture.
-    As you can see, there are already directories named "views" and "controllers" present here. The aforementioned tutorial (the one available in the "helpful links" section) describes the structure and contents of this directory in a really clear way, so I don't think there is a need to repeat this description here. The only thing I would like to point out is the way in which we are loading our custom JavaScript scripts. Take a look at the `lib/videoroom_web/room/index.html.heex` file ([as the Phoenix tutorial says](https://hexdocs.pm/phoenix/request_lifecycle.html), this file should contain an EEx template for your room controller) - you will find the following line there:
-
-    ```html
-    <script src="/js/room.js"></script>
-    ```
-
-    As you can see, we are loading a script which is placed in `/js/room.js` (notice that the path provided there is given with respect to the `priv/static/` directory, which holds the files generated from the TypeScript scripts in the `assets/src/` directory)
-
-- **priv/static/**
-  Here you will find static assets. They can be generated, for instance, from the files contained in the `assets/` directory (TypeScript `.ts` files which are in `assets/src` are compiled into `.js` files put inside `priv/static/js`). Not interesting at all, despite the fact that we needed to load the `/js/room.js` script file from here ;)
diff --git a/videoroom/3_SystemArchitecture.md b/videoroom/3_SystemArchitecture.md
deleted file mode 100644
index a1f70e0..0000000
--- a/videoroom/3_SystemArchitecture.md
+++ /dev/null
@@ -1,78 +0,0 @@
-# System architecture
-
-Hang on for a moment! I know that after slipping through tons of documentation you are really eager to start coding, but let's think for a moment before taking any action. What do we want our application to look like?
-Can we somehow decompose our application?
-
-Sure we can - as in each web application, we have two independent subsystems:
-
-- server (backend) - written in Elixir, one and only one for the whole system. It will spawn a `Room` process for each of the rooms created by the users, which will handle
-  [signaling](../glossary/glossary.md#signaling) and relay media among the peers in the room.
-- client application (frontend) - written in the form of JS code and executed on each client's machine (to be precise - by the client's web browser). It will be responsible for fetching the user's media stream as well as displaying the streams from the peers.
-
-## We might need something else than the plain Elixir standard library...
-
-Ugh... I am sure that by now you have found out that media streaming is not that easy. It covers many topics which originate from the nature of reality.
-We need to deal with some limitations brought to us by the physics of the surrounding universe, we want to compress the data being sent with the great tools
-mathematics has equipped us with, and we take advantage of the imperfections of our perception system...
-All this stuff is both complex and complicated - and that is why we don't want to design it from scratch. Fortunately, we have access to protocols
-and codecs - [ICE](../glossary/glossary.md#ice), [RTP](../glossary/glossary.md#rtp), H264, VP9, VP8, Opus, etc. - which already solve the aforementioned problems. But that's not enough -
-those protocols are also complicated, and implementing or even using them requires digging into their fundamentals.
-That is why we will be using a framework that provides some level of abstraction on top of these protocols. Ladies and gentlemen - let me introduce to you - the Membrane Framework.
-
-## What does the Membrane Framework do?
-
-Go straight to the source! [Membrane documentation](https://membrane.stream/guide/v0.7/introduction.html)
-
-## Membrane framework structure
-
-It is good to know that the Membrane Framework consists of the following parts:
-
-- Core
-- Plugins
-
-We will be using one of its plugins - the [RTC Engine plugin](https://github.com/membraneframework/membrane_rtc_engine), which has both a server part (written in Elixir)
-and a client library (written in JavaScript). This plugin provides both the implementation of a
-[Selective Forwarding Unit (SFU)](https://github.com/membraneframework/membrane_rtc_engine) and the signaling server logic (with the usage of the ICE protocol).
-
-## System scheme
-
-The diagram below describes the desired architecture of the event-passing system, which is the part of the system we need to provide on our own:
-![Application Scheme](assets/images/total_scheme.png)
-
-And here is how the **[SFU](../glossary/glossary.md#sfu) Engine** will relay multimedia streams:
-![SFU scheme](assets/images/SFU_scheme.png)
-
-In terms of media streaming, our server will be a [Selective Forwarding Unit (SFU)](../glossary/glossary.md#sfu).
-Why do we want our server to be a Selective Forwarding Unit? The reason is that such a model of streaming data
-among peers allows us to balance between the server's and the clients' bandwidth and limit the CPU usage of the server.
-The RTC Engine receives streams from each of the peers and passes each of these streams to each of the other peers.
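-To build some intuition for what "forwarding" means here, the SFU fan-out rule can be sketched in a few lines of Elixir. This is only a conceptual sketch (the `forward/3` function and the `{:packet, ...}` message are made up for illustration) - the real forwarding happens inside the RTC Engine:
-
-```elixir
-defmodule SFUSketch do
-  # For every packet received from one peer, pass it on to every other peer
-  # in the room. `peer_pids` maps peer ids to the processes handling them.
-  def forward(packet, from_peer_id, peer_pids) do
-    peer_pids
-    |> Enum.reject(fn {peer_id, _pid} -> peer_id == from_peer_id end)
-    |> Enum.each(fn {_peer_id, pid} -> send(pid, {:packet, packet}) end)
-  end
-end
-```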
-
-## Server
-
-As pointed out previously, the server will have two responsibilities - the first one is that it will work as a signaling server, passing event messages among the peers.
-The second one is that it will act as a streaming server.
-A Selective Forwarding Unit implementation in the Membrane Framework can be achieved with the `RTC Engine` plugin, which is capable of both signaling and streaming media.
-In this tutorial, we will wrap the `RTC Engine` and provide business logic in order to add video room functionalities.
-
-The server will consist of two components holding the logic and two components needed for communication.
-The communication will be done with the use of [Phoenix](../glossary/glossary.md#phoenix) sockets, and that is why we will need to define the `socket` itself and a `channel` for each of the rooms.
-
-The "heart" of the server will be the `RTC Engine` - it will deal with all the dirty stuff connected with both signaling and streaming. We will also have a separate `Room` process (one per video room) whose responsibility will be to aggregate information about the peers in the particular room.
-The `RTC Engine` will send event messages (e.g. `:new_peer` or `:sfu_media_event`), upon whose reception the `Room` process will react, for instance, by dispatching them to the appropriate peer's `channel`. The `channel` will then send those messages to the client via the `socket`.
-Messages coming in on the `socket` will be dispatched to the appropriate `channel`. Then the `channel` will send them to the `Room` process, which will finally pass them to the `RTC Engine`. The RTC Engine will receive them inside its endpoints, since each peer will have a corresponding endpoint in the RTC Engine, as presented in the diagram below:
-
-![RTC Engine](assets/images/modular_rtc.png)
-
-Note that the scheme is simplified and does not show [elements](../glossary/glossary.md#element) (i.e. channels) that are in between the RTC Engine and the peers' browsers.
-If you want to find out more about the inner architecture of the RTC Engine, please refer [here](https://blog.swmansion.com/modular-rtc-engine-is-our-little-big-revolution-in-video-conferencing-cfde806c5beb).
-
-Media transmission will be done with the use of streaming protocols. How this is performed is out of the scope of this tutorial. The only thing you need to know is that the RTC Engine will also take care of it.
-
-## Client
-
-Each client's application will have a structure resembling the structure of the server.
-
-In the `Room` instance, the client will receive messages sent from the server on the associated `channel`. The `Room` will call the appropriate methods of the `MembraneWebRTC` object.
-At the same time, the `MembraneWebRTC` object will be able to change the `Room`'s state by invoking the callbacks provided during the construction of this object. These callbacks, as well as the `Room` object itself, will be able to update the user's interface.
-
-Be aware that the `MembraneWebRTC` object will also take care of the incoming media stream.
diff --git a/videoroom/4_CreatingServersCommunicationChannels.md b/videoroom/4_CreatingServersCommunicationChannels.md
deleted file mode 100644
index e874fa3..0000000
--- a/videoroom/4_CreatingServersCommunicationChannels.md
+++ /dev/null
@@ -1,154 +0,0 @@
-# Server's communication channel
-
-I know you have been waiting for this moment - let's start coding!
-
-## Let's prepare the server's endpoint
-
-Do you still remember Phoenix's sockets? Hopefully, since we will make use of them in a moment! We want to provide a communication channel between our client's application and our server.
-Sockets fit right in here - but be aware that they are not the only possible option. Neither WebRTC nor the Membrane Framework expects you to use any particular means of communication between
-the server and the client - they just want you to communicate.
-
-### Socket's declaration
-
-The socket's declaration is already present in our template. Take a quick glance at the `lib/videoroom_web/user_socket.ex` file.
-You will find the following code there:
-
-**_`lib/videoroom_web/user_socket.ex`_**
-```elixir
-defmodule VideoRoomWeb.UserSocket do
-  use Phoenix.Socket
-
-  channel("room:*", VideoRoomWeb.PeerChannel)
-
-  ...
-end
-```
-
-What happens here? Well, it is just a definition of our custom [Phoenix](../glossary/glossary.md#phoenix) socket. Starting from the top, we are:
-
-- saying that this module is a `Phoenix.Socket` and that we want to be able to override Phoenix's socket functions (['use' documentation](https://elixir-lang.org/getting-started/alias-require-and-import.html#use)) - `use Phoenix.Socket`
-- declaring our channel - `channel("room:*", VideoRoomWeb.PeerChannel)`. We are saying that all messages pointing to the `"room:*"` topic should be directed to the `VideoRoomWeb.PeerChannel` module (no worries, we will declare this module later). Notice the use of the wildcard sign `*` in the definition - effectively, we will be directing all requests whose topic starts with `"room:"` to the aforementioned channel - that is, both a message with the `"room:WhereAmI"` topic and one with the `"room:WhatANiceCosyRoom"` topic will be directed to `VideoRoomWeb.PeerChannel` (what's more, we will be able to recover the part of the topic hidden by the wildcard sign, so we will be able to distinguish between room names!)
-
-The rest is an implementation of the `Phoenix.Socket` interface - you can read about it [here](https://hexdocs.pm/phoenix/Phoenix.Socket.html#callbacks).
-
-### How does the server know that we are using the socket?
-
-That's quite easy - we defined the usage of our socket in `lib/videoroom_web/endpoint.ex`, inside the `VideoRoomWeb.Endpoint` module:
-
-**_`lib/videoroom_web/endpoint.ex`_**
-```elixir
-defmodule VideoRoomWeb.Endpoint do
-  ...
-  socket("/socket", VideoRoomWeb.UserSocket,
-    websocket: true,
-    longpoll: false
-  )
-  ...
-end
-```
-
-In this piece of code we are simply saying that we are defining a socket-type endpoint with the path `"/socket"`, whose behavior will be described by the
-`VideoRoomWeb.UserSocket` module.
-
-### Where is VideoRoomWeb.PeerChannel?
-
-It is in the `lib/videoroom_web/peer_channel.ex` file! However, for now, this file only declares the `VideoRoomWeb.PeerChannel` module and does not provide any implementation.
-
-**_`lib/videoroom_web/peer_channel.ex`_**
-```elixir
-defmodule VideoRoomWeb.PeerChannel do
-  use Phoenix.Channel
-
-  require Logger
-
-end
-```
-
-The module will handle messages sent and received on the previously created socket by implementing the `Phoenix.Channel` callbacks. To achieve that, we need to `use Phoenix.Channel`.
-
-Let's implement our first callback!
-
-**_`lib/videoroom_web/peer_channel.ex`_**
-```elixir
-@impl true
-def join("room:" <> room_id, _params, socket) do
-  case :global.whereis_name(room_id) do
-    :undefined -> Videoroom.Room.start(room_id, name: {:global, room_id})
-    pid -> {:ok, pid}
-  end
-  |> case do
-    {:ok, room_pid} ->
-      do_join(socket, room_pid, room_id)
-
-    {:error, {:already_started, room_pid}} ->
-      do_join(socket, room_pid, room_id)
-
-    {:error, reason} ->
-      Logger.error("""
-      Failed to start room.
-      Room: #{inspect(room_id)}
-      Reason: #{inspect(reason)}
-      """)
-
-      {:error, %{reason: "failed to start room"}}
-  end
-end
-
-defp do_join(socket, room_pid, room_id) do
-  peer_id = "#{UUID.uuid4()}"
-  Process.monitor(room_pid)
-  send(room_pid, {:add_peer_channel, self(), peer_id})
-
-  {:ok,
-   Phoenix.Socket.assign(socket, %{room_id: room_id, room_pid: room_pid, peer_id: peer_id})}
-end
-```
-
-Just the beginning - note how we fetch the room's name by using pattern matching in the argument list of `join/3` ([pattern matching in Elixir](https://elixir-lang.org/getting-started/pattern-matching.html#pattern-matching)).
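-If the `"room:" <> room_id` match looks unfamiliar, you can try binary pattern matching yourself in `iex` (a standalone snippet, unrelated to the project code):
-
-```elixir
-# In a pattern, `<>` matches a literal binary prefix and binds the rest:
-"room:" <> room_id = "room:WhatANiceCosyRoom"
-# room_id is now bound to "WhatANiceCosyRoom"
-```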
-
-What happens here?
-`join/3` is called when the client joins the channel. First, we look for a `Videoroom.Room` process saved in the `:global` registry under the `room_id` key
-(the `Videoroom.Room` module will hold the whole business logic of the video room - we will implement this module in the next chapter).
-If the videoroom process is already registered, we simply return its PID. Otherwise, we try to create
-a new `Videoroom.Room` process on the fly (and we register it under the `room_id` key in the global registry).
-If we are successful, we return the PID of the newly created room process.
-At the entry point of the following step, we already have a `Videoroom.Room` process's PID or an `:error` notification.
-Errors can occur for multiple reasons. One of them is a race condition between peers trying to create the same room.
-Imagine a situation where two users are trying to join a non-existent room at the same moment. Since they are working asynchronously, there is a chance that both of them will
-get an answer from `:global.whereis_name(room_id)` saying that the room with the given name does not exist. Both of them will then try to create such a room. The request from one of these users will reach the `:global` registry first and the room will be
-registered - so the second user will receive an `:already_started` error, along with the PID of that room process, since the process already exists. Handling that error is quite straightforward - the user can safely join the room with the provided PID.
-Of course, some other errors might also occur, but we do not distinguish between them - we simply log the fact that there was a problem with the room creation.
-In case we retrieve a PID of the room process, we call the `do_join/3` helper function.
-`do_join/3` holds some repeatable parts of the code concerning the joining process.
-Inside that function, we start to monitor the room process (so that we will receive a `:DOWN` message in case of the room process's crash/failure). Then we notify the room process that
-it should take us (the peer channel) into consideration - we send our peer_id (generated as a unique id with the UUID module) along with the peer channel's PID to
-the room process in the `:add_peer_channel` message, so that the room will have a way to identify our process. The last thing we do is add information about the association between the
-room's identifier, the room's PID, and the peer's identifier to the map of the socket's assigns. We will refer to this information later, so we need to store it somehow.
-
-Our channel acts as a communication channel between the Room process on the backend and the client application on the frontend. The responsibility of the channel is to simply forward all `:media_event` messages from the room to the client and all `mediaEvent` messages from the client to the Room process.
-The first one is done by implementing the `handle_info/2` callback as shown below:
-
-**_`lib/videoroom_web/peer_channel.ex`_**
-```elixir
-@impl true
-def handle_info({:media_event, event}, socket) do
-  push(socket, "mediaEvent", %{data: event})
-  {:noreply, socket}
-end
-```
-
-The second one is done by providing the following implementation of `handle_in/3`:
-
-**_`lib/videoroom_web/peer_channel.ex`_**
-
-```elixir
-@impl true
-def handle_in("mediaEvent", %{"data" => event}, socket) do
-  send(socket.assigns.room_pid, {:media_event, socket.assigns.peer_id, event})
-  {:noreply, socket}
-end
-```
-
-Note the use of the `push/3` function provided by `Phoenix.Channel`.
-
-Great job! You have just implemented the server's side of our communication channel. How about adding our server's business logic?
diff --git a/videoroom/5_ImplementingServerRoom.md b/videoroom/5_ImplementingServerRoom.md deleted file mode 100644 index f2664b6..0000000 --- a/videoroom/5_ImplementingServerRoom.md +++ /dev/null @@ -1,235 +0,0 @@ -# Server's room - -## Let's create The Room! ;) - -We are still missing probably the most important part - the heart of our application - the implementation of the room. -The room should dispatch messages sent from RTC Engine to appropriate peer channels - and at the same time, it should direct all the messages sent to it via peer channel to the RTC Engine. -Let's start by creating `lib/videoroom/room.ex` file with a declaration of Videoroom.Room module: - -**_`lib/videoroom/room.ex`_** - -```elixir -defmodule Videoroom.Room do - @moduledoc false - - use GenServer - alias Membrane.RTC.Engine - alias Membrane.RTC.Engine.Message - alias Membrane.RTC.Engine.Endpoint.WebRTC - require Membrane.Logger - - @mix_env Mix.env() - - #we will put something here ;) -end -``` - -We will be using OTP's [GenServer](https://elixir-lang.org/getting-started/mix-otp/genserver.html) to describe the behavior of this module. 
-
-Let's start by adding wrappers for GenServer's `start` and `start_link` functions:
-
-**_`lib/videoroom/room.ex`_**
-
-```elixir
-def start(init_arg, opts) do
-  GenServer.start(__MODULE__, init_arg, opts)
-end
-
-def start_link(opts) do
-  GenServer.start_link(__MODULE__, [], opts)
-end
-```
-
-Then we provide the implementation of the `init/1` callback:
-
-**_`lib/videoroom/room.ex`_**
-
-```elixir
-@impl true
-def init(room_id) do
-  Membrane.Logger.info("Spawning room process: #{inspect(self())}")
-
-  rtc_engine_options = [
-    id: room_id
-  ]
-
-  mock_ip = Application.fetch_env!(:membrane_videoroom_demo, :external_ip)
-  external_ip =
-    if @mix_env == :prod or System.get_env("RUNNING_IN_DOCKER", "0") == "1",
-      do: {0, 0, 0, 0},
-      else: mock_ip
-
-  ports_range = Application.fetch_env!(:membrane_videoroom_demo, :port_range)
-
-  integrated_turn_options = [
-    ip: external_ip,
-    mock_ip: mock_ip,
-    ports_range: ports_range
-  ]
-
-  network_options = [
-    integrated_turn_options: integrated_turn_options,
-    dtls_pkey: Application.get_env(:membrane_videoroom_demo, :dtls_pkey),
-    dtls_cert: Application.get_env(:membrane_videoroom_demo, :dtls_cert)
-  ]
-
-  {:ok, pid} = Membrane.RTC.Engine.start(rtc_engine_options, [])
-  Engine.register(pid, self())
-
-  {:ok, %{rtc_engine: pid, peer_channels: %{}, network_options: network_options}}
-end
-```
-
-For the description of `rtc_engine_options` please refer to [Membrane's documentation](https://hexdocs.pm/membrane_rtc_engine/Membrane.RTC.Engine.html#content).
-
-We start the `Membrane.RTC.Engine` process (we will refer to this process by its `pid`), which will serve as an RTC server.
-Then we send a message to this process to register ourselves (so that the RTC engine is aware that we are the process responsible for dispatching the messages sent from the RTC engine to the clients).
-
-The last thing we do is return the initial state of the GenServer - in our state we hold `:rtc_engine`, which is the PID of the engine process, and `peer_channels` - a map of the form (peer_id -> peer_channel_pid). For now, this map is empty.
-
-What's next? We need to handle the callbacks to properly react to incoming events. Once again - please take a look at the [plugin documentation](https://hexdocs.pm/membrane_rtc_engine/Membrane.RTC.Engine.html#module-messages) to find out what types of messages the RTC Engine sends and what types of messages it expects to receive.
-We won't implement handling for all of these messages - only the ones which are crucial to set up the connection between peers, start media streaming, and take proper actions when participants disconnect. After finishing this tutorial you can try to implement handling of the other messages (for instance, you could turn on voice activity detection messages - `:vad_notification` - you can read more about those in chapter 7).
-Let's start with handling the messages sent to us by the RTC Engine.
-
-**_`lib/videoroom/room.ex`_**
-```elixir
-@impl true
-def handle_info(%Message.MediaEvent{to: :broadcast, data: data}, state) do
-  for {_peer_id, pid} <- state.peer_channels, do: send(pid, {:media_event, data})
-
-  {:noreply, state}
-end
-```
-
-Here comes the first one - once we receive a `%Message.MediaEvent{}` from the RTC engine with the `:broadcast` specifier, we send this event to all the peer channels currently stored in the `state.peer_channels` map of our GenServer's state. We need to "reformat" the event so that the message sent to the peer channel matches the interface we defined previously in `VideoroomWeb.PeerChannel`. If you are new to GenServers, you might wonder what we are returning from this function - in fact, we are returning the state updated while handling this message.
In our case the state stays the same, so we do not change anything. `:noreply` means that we do not need to send a reply to the sender (which, in our case, is the RTC engine process). The updated state will then be passed to the callback handling the next message - and will be updated while handling that message. And so on and so on :)
-
-Here comes the next callback:
-
-**_`lib/videoroom/room.ex`_**
-```elixir
-@impl true
-def handle_info(%Message.MediaEvent{to: to, data: data}, state) do
-  if state.peer_channels[to] != nil do
-    send(state.peer_channels[to], {:media_event, data})
-  end
-
-  {:noreply, state}
-end
-```
-
-The idea here is very similar to the one in the previous snippet - we want to direct messages sent by the RTC Engine's server to the RTC Engine's client.
-The only difference is that the event is addressed to a particular user - that is why instead of the `:broadcast` atom, the `to` field holds a peer's unique id. Since we know precisely to whom we should send the message, there is nothing left to do but find the peer channel's PID associated with the given peer id (we hold the (peer_id -> peer_channel_pid) mapping in the GenServer's state!) and send the message there. Once again the state does not need to change.
-
-Here we go with another message sent by the RTC engine:
-
-**_`lib/videoroom/room.ex`_**
-```elixir
-@impl true
-def handle_info(%Message.NewPeer{rtc_engine: rtc_engine, peer: peer}, state) do
-  Membrane.Logger.info("New peer: #{inspect(peer)}. Accepting.")
-  # get the node that the peer channel with the given peer_id is running on
-  peer_channel_pid = Map.get(state.peer_channels, peer.id)
-  peer_node = node(peer_channel_pid)
-
-  handshake_opts =
-    if state.network_options[:dtls_pkey] &&
-         state.network_options[:dtls_cert] do
-      [
-        client_mode: false,
-        dtls_srtp: true,
-        pkey: state.network_options[:dtls_pkey],
-        cert: state.network_options[:dtls_cert]
-      ]
-    else
-      [
-        client_mode: false,
-        dtls_srtp: true
-      ]
-    end
-
-  endpoint = %WebRTC{
-    rtc_engine: rtc_engine,
-    ice_name: peer.id,
-    extensions: %{},
-    owner: self(),
-    integrated_turn_options: state.network_options[:integrated_turn_options],
-    handshake_opts: handshake_opts,
-    log_metadata: [peer_id: peer.id]
-  }
-
-  Engine.accept_peer(rtc_engine, peer.id)
-
-  :ok = Engine.add_endpoint(rtc_engine, endpoint, peer_id: peer.id, node: peer_node)
-
-  {:noreply, state}
-end
-```
-
-That one might seem a little bit tricky. What is the deal here? Be aware that our room process is the only one holding the mapping between a peer's id and a peer channel's PID. Once a new peer joins, the RTC Engine is not aware of that peer channel's PID - that is why it asks our room process for information about the new peer.
-Apart from the peer channel's PID, the room process also passes the identifier of the node on which the peer channel's process is located (notice that, thanks to the BEAM virtual machine, our application can be distributed - the server can be spread over many different nodes working in the same cluster).
-Later on, there comes a bunch of option definitions that are used to define a WebRTC endpoint.
-Then we create an endpoint corresponding to the peer who is trying to join.
If you are interested in the options available in the WebRTC endpoint, you can read about them [here](https://hexdocs.pm/membrane_rtc_engine/Membrane.RTC.Engine.Endpoint.WebRTC.html), but in most cases all you will ever want to do with them is simply copy-paste ;)
-Finally, we accept the peer and add their endpoint to the RTC Engine.
-
-Here comes the next callback!
-Once we receive a `%Message.PeerLeft{}` message from the RTC Engine, we simply log it and otherwise ignore it (we could of course remove the peer_id from the (peer_id -> peer_channel_pid) mapping... but do we need to?):
-
-**_`lib/videoroom/room.ex`_**
-```elixir
-@impl true
-def handle_info(%Message.PeerLeft{peer: peer}, state) do
-  Membrane.Logger.info("Peer #{inspect(peer.id)} left RTC Engine")
-  {:noreply, state}
-end
-```
-
-In case the RTC Engine wants to communicate with the client during the [signaling](../glossary/glossary.md#signaling) process, we know how to react - we simply pass the message to the appropriate `PeerChannel`.
-How about messages coming from the client, via the `PeerChannel`? We need to pass them to the RTC Engine!
-
-**_`lib/videoroom/room.ex`_**
-```elixir
-@impl true
-def handle_info({:media_event, _from, _event} = msg, state) do
-  Engine.receive_media_event(state.rtc_engine, msg)
-  {:noreply, state}
-end
-```
-
-Again - no magic tricks here. We receive a `:media_event` and send it to our RTC engine process.
-And here comes the callback for the `:add_peer_channel` message:
-
-**_`lib/videoroom/room.ex`_**
-```elixir
-@impl true
-def handle_info({:add_peer_channel, peer_channel_pid, peer_id}, state) do
-  state = put_in(state, [:peer_channels, peer_id], peer_channel_pid)
-  Process.monitor(peer_channel_pid)
-  {:noreply, state}
-end
-```
-
-It is a great example of what state updating looks like. We put the new entry into our (peer_id -> peer_channel_pid) map - and we return the state updated this way.
Meanwhile, we also start monitoring the process with PID `peer_channel_pid` - so that we receive a `:DOWN` message when the peer channel process goes down.
-
-We are almost done! We are monitoring all the peer channel processes. Once one of them dies, we receive a `:DOWN` message. Let's handle this event!
-
-**_`lib/videoroom/room.ex`_**
-```elixir
-@impl true
-def handle_info({:DOWN, _ref, :process, pid, _reason}, state) do
-  {peer_id, _peer_channel_pid} =
-    Enum.find(state.peer_channels, fn {_peer_id, peer_channel_pid} ->
-      peer_channel_pid == pid
-    end)
-
-  Engine.remove_peer(state.rtc_engine, peer_id)
-  {_elem, state} = pop_in(state, [:peer_channels, peer_id])
-  {:noreply, state}
-end
-```
-
-First, we find the id of the peer whose channel has died. Then we send a message to the RTC engine telling it to remove the peer with that peer_id.
-The last thing to do is update the state - we remove the (peer_id -> peer_channel_pid) entry from our `:peer_channels` map.
-
-After all this hard work our server is finally ready. But we still need a client application.
diff --git a/videoroom/6_ImplementingClientsApplication.md b/videoroom/6_ImplementingClientsApplication.md
deleted file mode 100644
index e0187e2..0000000
--- a/videoroom/6_ImplementingClientsApplication.md
+++ /dev/null
@@ -1,386 +0,0 @@
-# Client's application
-
-## Let's implement the client's endpoint!
-We will put the whole logic into `assets/src/room.ts`. The methods responsible for handling the UI are already in `assets/src/room_ui.ts`, so let's import them:
-
-**_`assets/src/room.ts`_**
-```ts
-import {
-  addVideoElement,
-  getRoomId,
-  removeVideoElement,
-  setErrorMessage,
-  setParticipantsList,
-  attachStream,
-  setupDisconnectButton,
-} from "./room_ui";
-```
-
-We have basically imported all the methods defined in `room_ui.ts`. For more details on how these methods work and what their interfaces look like, please refer to the source file.
-Take a look at our `assets/package.json` file, which defines the external dependencies of our project.
We have put there the following dependency:
-
-**_`assets/package.json`_**
-```JSON
-"dependencies": {
-  "@membraneframework/membrane-webrtc-js": "^0.3.0",
-  ...
-}
-```
-
-which is a client library for the RTC engine plugin from the Membrane Framework.
-Let's import some constructs from this library (their names should be self-explanatory, and you can read about them in [the official Membrane's RTC engine documentation](https://docs.membrane.stream/membrane-webrtc-js)), along with some other dependencies which we will need later:
-
-**_`assets/src/room.ts`_**
-```ts
-import {
-  MembraneWebRTC,
-  Peer,
-  SerializedMediaEvent,
-} from "@membraneframework/membrane-webrtc-js";
-import { MEDIA_CONSTRAINTS, LOCAL_PEER_ID } from "./consts";
-import { Push, Socket } from "phoenix";
-import { parse } from "query-string";
-```
-
-Once we are ready with the imports, it might be worth wrapping our room's client logic into a class - so at the very beginning let's simply define the `Room` class:
-
-**_`assets/src/room.ts`_**
-```ts
-export class Room {
-  private socket;
-  private webrtcSocketRefs: string[] = [];
-  private webrtcChannel;
-  private displayName: string;
-  private webrtc: MembraneWebRTC;
-  private peers: Peer[] = [];
-  private localStream: MediaStream | undefined;
-
-  constructor() {
-  }
-
-  private init = async () => {
-  };
-
-  public join = () => {
-  };
-
-  private leave = () => {
-  };
-
-  private updateParticipantsList = (): void => {
-  };
-
-  private phoenixChannelPushResult = async (push: Push): Promise<any> => {
-  };
-
-  // no worries, we will put something into these functions :)
-}
-```
-
-Let's start with the constructor that will initialize the member fields:
-
-**_`assets/src/room.ts`_**
-```ts
-constructor(){
-  this.socket = new Socket("/socket");
-  this.socket.connect();
-  const { display_name: displayName } = parse(document.location.search);
-  this.displayName = displayName as string;
-  window.history.replaceState(null, "", window.location.pathname);
-
this.webrtcChannel = this.socket.channel(`room:${getRoomId()}`);
-  ...
-}
-```
-
-What happens at the beginning of the constructor? We create a new [Phoenix](../glossary/glossary.md#phoenix) Socket with the `/socket` path (it must be the same as the one we defined on the server side!) and right after that we start the connection.
-Later on, we retrieve the display name from the URL (the user set it in the UI while joining the room, and it was passed to the next view as a URL param).
-Then we connect to the Phoenix channel on the topic `` `room:${getRoomId()}` `` - the room name is fetched from the UI.
-Following on with the constructor implementation:
-
-**_`assets/src/room.ts`_**
-```ts
-constructor(){
-  ...
-  const socketErrorCallbackRef = this.socket.onError(this.leave);
-  const socketClosedCallbackRef = this.socket.onClose(this.leave);
-  this.webrtcSocketRefs.push(socketErrorCallbackRef);
-  this.webrtcSocketRefs.push(socketClosedCallbackRef);
-  ...
-}
-```
-
-This structure might look a little bit ambiguous. What are we storing in `this.webrtcSocketRefs`? Well, we are storing references... to the callbacks we have just defined.
-We specified which method should be invoked in case our Phoenix socket is closed or experiences an error of some type - that is, the `this.leave()` method, which we will define later.
-However, we want to keep track of those callbacks so that we are able to turn them off later ("unregister" them).
-Where will we unregister the callbacks? Inside the `this.leave()` method!
-
-Now let's get back to the constructor and initialize a MembraneWebRTC object!
-
-**_`assets/src/room.ts`_**
-```ts
-constructor(){
-  ...
-  this.webrtc = new MembraneWebRTC({callbacks: callbacks});
-  ...
-}
-```
-
-According to the MembraneWebRTC [documentation](https://docs.membrane.stream/membrane-webrtc-js/interfaces/callbacks.html), we need to specify the behavior of the RTC engine client by passing the proper callbacks during construction.
-
-We will go through the callbacks list one by one, providing the desired implementation for each of them. All you need to do later is gather them together into one JS object called `callbacks` before initializing the `this.webrtc` object.
-
-## Callbacks
-
-### onSendMediaEvent
-
-```ts
-onSendMediaEvent: (mediaEvent: SerializedMediaEvent) => {
-  this.webrtcChannel.push("mediaEvent", { data: mediaEvent });
-},
-```
-
-If a `mediaEvent` from our client Membrane library appears (this event can be one of many types - for instance, it can be a message containing information about an [ICE](../glossary/glossary.md#ice) candidate in the form of an [SDP](../glossary/glossary.md#sdp) attribute), we need to pass it to the server. That is why we make use of our Phoenix channel, whose other endpoint is on the server side - and simply push the data through that channel. The form of the pushed event - `("mediaEvent", { data: mediaEvent })` - is the one expected on the server side; recall the implementation of `VideoRoomWeb.PeerChannel.handle_in("mediaEvent", %{"data" => event}, socket)`.
-
-### onConnectionError
-
-```ts
-onConnectionError: setErrorMessage,
-```
-
-This one is quite easy - if an error occurs on the client side of our library, we simply set an error message.
-The `setErrorMessage` method is already provided in our template, but take a look at it - the `onConnectionError` callback forces us to provide a method with a given signature (because it passes some parameters which might be helpful to track down the reason for the error).
-
-### onJoinSuccess
-
-We will manipulate the list of peers in this method.
-Here is the `onJoinSuccess` callback implementation:
-
-```ts
-onJoinSuccess: (peerId, peersInRoom) => {
-  this.localStream!.getTracks().forEach((track) =>
-    this.webrtc.addTrack(track, this.localStream!)
-  );
-
-  this.peers = peersInRoom;
-  this.peers.forEach((peer) => {
-    addVideoElement(peer.id, peer.metadata.displayName, false);
-  });
-  this.updateParticipantsList();
-},
-```
-
-Once we have successfully joined the room, we make the `MembraneWebRTC` object aware of our `this.localStream` tracks (do you remember that we have an audio and a video track?).
-Later on, we add a video element for each of the peers (we want to see the video from each peer in our room, don't we?).
-The last thing we do is invoke the method which updates the participants list (we want the list of all the participants in our room to be nicely displayed) - let's wrap this functionality into another method:
-
-```ts
-private updateParticipantsList = (): void => {
-  const participantsNames = this.peers.map((p) => p.metadata.displayName);
-
-  if (this.displayName) {
-    participantsNames.push(this.displayName);
-  }
-
-  setParticipantsList(participantsNames);
-};
-```
-
-We simply put all the peers' display names into a list and then add our own name on top of it. The last thing to do is inform the UI that the participants list has changed - we do it by invoking `setParticipantsList(participantsNames)` from `assets/src/room_ui.ts`.
-
-How about trying to implement the rest of the callbacks on your own? Please refer to the [documentation](https://docs.membrane.stream/membrane-webrtc-js/interfaces/callbacks.html) and think about where you can use the methods from `./assets/src/room_ui.ts`.
-Below you will find the expected result (callback implementation) for each of the methods - it might not be the best implementation... but this is the implementation you can afford!
-Seriously speaking - we have split some of these callbacks' implementations into multiple functions, following some good practices, and we consider the result to be a little bit... cleaner ;)
-
-### onJoinError
-
-```ts
-onJoinError: (metadata) => {
-  throw `Peer denied.`;
-},
-```
-
-### onTrackReady
-
-```ts
-onTrackReady: ({ stream, peer, metadata }) => {
-  attachStream(stream!, peer.id);
-},
-```
-
-### onTrackAdded
-
-```ts
-onTrackAdded: (ctx) => {},
-```
-
-### onTrackRemoved
-
-```ts
-onTrackRemoved: (ctx) => {},
-```
-
-### onPeerJoined
-
-```ts
-onPeerJoined: (peer) => {
-  this.peers.push(peer);
-  this.updateParticipantsList();
-  addVideoElement(peer.id, peer.metadata.displayName, false);
-},
-```
-
-### onPeerLeft
-
-```ts
-onPeerLeft: (peer) => {
-  this.peers = this.peers.filter((p) => p.id !== peer.id);
-  removeVideoElement(peer.id);
-  this.updateParticipantsList();
-},
-```
-
-### onPeerUpdated
-
-```ts
-onPeerUpdated: (ctx) => {},
-```
-
-Once we are ready with the `MembraneWebRTC` callbacks implementation, let's specify how to behave when the server sends us a message on the channel.
-We need to implement an event handler:
-
-**_`assets/src/room.ts`_**
-```ts
-constructor(){
-  ...
-  this.webrtcChannel.on("mediaEvent", (event) =>
-    this.webrtc.receiveMediaEvent(event.data)
-  );
-}
-```
-
-Once we receive a `mediaEvent` from the server (which can be, for instance, a notification that a peer has left), we simply pass it to the `MembraneWebRTC` object to take care of it.
-
-Now we have the `Room`'s constructor defined! However, not all the operations needed to connect to the server are performed inside the constructor.
-
-Further initialization might take some time.
That's why it might be a good idea to define an asynchronous method `join()`:
-
-**_`assets/src/room.ts`_**
-```ts
-public join = async () => {
-  try {
-    await this.init();
-    setupDisconnectButton(() => {
-      this.leave();
-      window.location.replace("");
-    });
-    this.webrtc.join({ displayName: this.displayName });
-  } catch (error) {
-    console.error("Error while joining the room:", error);
-  }
-};
-```
-
-First, we wait for `this.init()` to complete. This method is responsible for initializing the media streams.
-Then we set up the disconnect button (which means we make the button call `this.leave()` once it gets clicked).
-Later on, we make our MembraneWebRTC object [`join()`](https://docs.membrane.stream/membrane-webrtc-js/classes/membranewebrtc.html#join) the room with our display name.
-
-Let's provide the implementation of `this.init()` used in the `this.join()` method.
-As noted previously, this method initializes the user's media stream handlers.
-This is what the implementation of `this.init()` can look like:
-
-**_`assets/src/room.ts`_**
-```ts
-private init = async () => {
-  try {
-    this.localStream = await navigator.mediaDevices.getUserMedia(
-      MEDIA_CONSTRAINTS
-    );
-  } catch (error) {
-    console.error(error);
-    setErrorMessage(
-      "Failed to setup video room, make sure to grant camera and microphone permissions"
-    );
-    throw "error";
-  }
-
-  addVideoElement(LOCAL_PEER_ID, "Me", true);
-  attachStream(this.localStream!, LOCAL_PEER_ID);
-
-  await this.phoenixChannelPushResult(this.webrtcChannel.join());
-};
-```
-
-In the code snippet shown above we do a really important thing - we get a reference to the user's media. `navigator.mediaDevices.getUserMedia()` is an asynchronous method allowing the browser to fetch the tracks of the user's media. We can pass some media constraints which will limit the tracks available in the stream.
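For illustration, such a constraints object might look like this (hypothetical values - the actual definition lives in `assets/src/consts.ts`):

```ts
// A hypothetical getUserMedia constraints object: request audio as-is and
// video at a specific resolution. The real values live in assets/src/consts.ts.
const MEDIA_CONSTRAINTS = {
  audio: true,
  video: { width: 1280, height: 720 },
};
```

Anything requested here and granted by the user ends up as a track on the resulting `MediaStream`.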
-Take a look at the `assets/src/consts.ts` file, where you will find the `MEDIA_CONSTRAINTS` definition - it says that we want to get both audio and video data (but in a specified format!).
-Later on, we deal with the UI - we add a video element to our DOM.
-Since we need to distinguish between many video tiles in the DOM, we associate each of them with an ID.
-In the case of this newly added video element (which will display the stream from our local camera), the ID is the `LOCAL_PEER_ID` constant.
-We specify that we want our local stream to be displayed in the video element with the `LOCAL_PEER_ID` identifier by using the `attachStream()` method.
-The last thing we do here is wait for the result of the `this.webrtcChannel.join()` method (calling this method will invoke the `VideoRoomWeb.PeerChannel.join()` function on the server side).
-`this.phoenixChannelPushResult` simply wraps this result:
-
-**_`assets/src/room.ts`_**
-```ts
-private phoenixChannelPushResult = async (push: Push): Promise<any> => {
-  return new Promise((resolve, reject) => {
-    push
-      .receive("ok", (response: any) => resolve(response))
-      .receive("error", (response: any) => reject(response));
-  });
-};
-```
-
-Oh, we almost forgot! We need to define the `this.leave()` method:
-
-**_`assets/src/room.ts`_**
-```ts
-private leave = () => {
-  this.webrtc.leave();
-  this.webrtcChannel.leave();
-  this.socket.off(this.webrtcSocketRefs);
-  this.webrtcSocketRefs = [];
-};
-```
-
-What we do here is call the leave methods of both our MembraneWebRTC object and the Phoenix channel. Then we call the `this.socket.off(refs)` method ([click here for documentation](https://hexdocs.pm/phoenix/js/#off))
-\- which unregisters all the callbacks. The last thing we need to do is empty the references list.
-
-Ok, it seems that we have defined the whole process of creating and initializing a `Room` object.
-Why not create this object!
Go to the `assets/src/index.ts` file (do you remember that this is the file loaded in the .heex template for our room's view?).
-Until now this file has probably been empty. Let's create a `Room` instance there!
-
-**_`assets/src/index.ts`_**
-```ts
-import { Room } from "./room";
-
-let room = new Room();
-room.join();
-```
-
-The first thing we do is import the appropriate class. Then we create a new `Room` instance (the `constructor()` gets called).
-Later on, we join the server (which might take some time, as it needs to get access to the user's media - that is why this method is asynchronous).
-That's it! We have our client defined! In case something does not work properly (or in case we have forgotten to describe some crucial part of the code ;) )
-feel free to refer to the implementation of the video room's client side available
-[here](https://github.com/membraneframework/membrane_videoroom_tutorial/tree/template/end/assets/src).
-
-Now, finally, you should be able to check the fruits of your labor!
-Please run:
-
-```bash
-EXTERNAL_IP=<your external IP address> mix phx.server
-```
-
-Then, visit the following page in your browser:
-[http://localhost:4000](http://localhost:4000) -
-and then join a room with a given name!
-Later on, you can visit your video room's page once again, from another tab or window of your browser (or even another browser - the recommended browsers are Chrome and Firefox) and join the same room as before - you should then see two participants in the same room!
diff --git a/videoroom/7_FurtherSteps.md b/videoroom/7_FurtherSteps.md
deleted file mode 100644
index e71d700..0000000
--- a/videoroom/7_FurtherSteps.md
+++ /dev/null
@@ -1,51 +0,0 @@
-# Further steps

-We can share with you some inspiration for further improvements!
-
-## Voice activity detection
-Wouldn't it be great to have a feature that marks the person currently speaking in the room? That's where voice activity detection (VAD) comes into play!
-
-You can turn VAD on by changing some fields while creating an endpoint on the server:
-
-**_`lib/videoroom/room.ex`_**
-```elixir
-@impl true
-def handle_info(%Message.NewPeer{rtc_engine: rtc_engine, peer: peer}, state) do
-  # ...
-  endpoint = %WebRTC{
-    rtc_engine: rtc_engine,
-    ice_name: peer.id,
-    extensions: %{
-      opus: Membrane.RTC.VAD
-    },
-    webrtc_extensions: [
-      Membrane.WebRTC.Extension.VAD
-    ],
-    owner: self(),
-    integrated_turn_options: state.network_options[:integrated_turn_options],
-    handshake_opts: handshake_opts,
-    log_metadata: [peer_id: peer.id]
-  }
-  # ...
-end
-```
-
-The RTC Engine will now start sending messages of the form ```{:vad_notification, val, peer_id}``` (where `val` is either `:speech` or `:silence`) whenever someone starts or stops speaking. We simply need to pass this message from the RTC engine to the client's application and take some action once it is received - for instance, you can change the user's name displayed under the video panel, so that instead of the plain user's name (e.g. "John") we would see a "&lt;user's name&gt; is speaking now" message.
-Below you can see the expected result:
-
-![VAD example](assets/records/vad.webp "VAD example")
-
-Hopefully you will find the diagram below helpful, as it describes the flow of the VAD notification and shows which components of the system need to be changed:
-
-![VAD Flow Scheme](assets/images/vad_flow_scheme.png "VAD flow scheme")
-
-## Muting/unmuting
-
-It's not necessary for each peer to hear everything...
-Why not allow users of our video room to mute themselves when they want to?
-This simple feature has nothing to do with the server side of our system. Everything you need to do in order to disable the outgoing voice stream can be found in the [WebRTC MediaStreamTrack API documentation](https://developer.mozilla.org/en-US/docs/Web/API/MediaStreamTrack). You need to find a way to disable and re-enable the audio [track](../glossary/glossary.md#track) of your local media stream, and then add a button that puts you in a "muted" or "unmuted" state. The expected result is shown below:
-![Mute example](assets/records/mute.webp "mute example")
-
-You can also experiment with disabling the video track (so that the user can turn the camera off and on while in the room).
-
-Here our journey ends! I modestly hope that you have enjoyed the tutorial and have gotten as much interesting information and as many skills out of it as possible. Or maybe you have even found yourself passionate about media streaming? Goodbye and have a great time playing with the tool you have just created!
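P.S. As a parting sketch of the muting idea described above - toggling `enabled` on the local stream's audio tracks is all it takes on the client. The `setMuted` helper and the structural stream type here are hypothetical (just for illustration), while `MediaStreamTrack.enabled` itself is standard WebRTC API:

```ts
// Flip `enabled` on every audio track of the local stream; a disabled track
// keeps the connection alive but sends silence instead of captured audio.
type AudioTrackLike = { enabled: boolean };
type StreamLike = { getAudioTracks(): AudioTrackLike[] };

function setMuted(stream: StreamLike, muted: boolean): void {
  for (const track of stream.getAudioTracks()) {
    track.enabled = !muted;
  }
}
```

Wiring this to a button click on `this.localStream` would be one plausible way to finish the exercise.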
diff --git a/videoroom/assets/diagrams/SFU_scheme.drawio b/videoroom/assets/diagrams/SFU_scheme.drawio
deleted file mode 100644
diff --git a/videoroom/assets/diagrams/VAD_flow_scheme.drawio b/videoroom/assets/diagrams/VAD_flow_scheme.drawio
deleted file mode 100644
diff --git a/videoroom/assets/diagrams/client_flow1.drawio b/videoroom/assets/diagrams/client_flow1.drawio
deleted file mode 100644
OhDNMBdYZRo+YCodI4lzsHrNg+PmNKakGqd1TYrHVxGKBA8tdcLoOY/v72w8eRtvgkg2o5imjREBYHi3DmLw/Q3uj/9PQ2urW1tLSB+ryHNnJfe0RovtocHDf5wxtHu0u2x9citbOO+7dHpx5VNy6PXxfhnIcF6j1Tj8Fmdq6Bc4JUtQ+r6zh3To5dTdo+mLMvVi2Yq0THsdqHHIPZcwjCJMPq18gFWBUv+Myr/4A \ No newline at end of file diff --git a/videoroom/assets/diagrams/client_flow2.drawio b/videoroom/assets/diagrams/client_flow2.drawio deleted file mode 100644 index abdf063..0000000 --- a/videoroom/assets/diagrams/client_flow2.drawio +++ /dev/null @@ -1 +0,0 @@ -1Vpbc+MmFP41nrYP8Qh0f4ydbNqZpN1uspPdpw6SkEUjGwVhx+6vL1jogiRfUsve1A8JHC5CfOd85wN7ZE7n6zuGsuSBRjgdQSNaj8ybEYTAgo74Jy2bwuLKmjTMGIlUp9rwSP7Bymgo65JEONc6ckpTTjLdGNLFAodcsyHG6JveLaap/tQMzXDH8BiitGt9JhFPCqsH3dr+KyazpHwycPyiZY7KzupN8gRF9K1hMm9H5pRRyovSfD3Fqdy8cl+KcZ92tFYLY3jBjxmQr+6fn+6up6/+5tNi+bqa/P7kXCkwVihdqhceQScV801iKqYVqy4RcV6XcqUTUBdFaab+b4cEsj/fpPoAOdFVMc216GAa2bo7xRdK5+U04hWC9tTCViyoNEPtWZDR5SLC8k0N0fyWEI4fMxTK1jfhmMKW8HkqakCtSHkaMGWdpOmUppRt5zLjOIZhKOw5Z/QFN1oiJ3Bsp3p+c+8VHCvMOF43TAqLO0znmLON6KJaPeUWm9LfVf2t9jJglFGQNFysGomUa8+quWv0RUE5wDucwTzoDIfAhb3gZgnFC7IeP9LwBfNzQwpBD6ReiPshDTzbso1hIPVtHVKrC6lp9yAKzoaoczKiwN6H6DRBgnbT80CqQxjZ2IusPgg9GJjOQFEJ2mFp2d2w9HowdM4FIYCdbcSRSFGqShlP6IwuUHpbWyf6Rtd97inN1Pb+jTnfqJBBS071zcdrwr+p4bL8XZbH0FbVm3Wj7WZTVhbihb+VM8hKMcwuq/Wwba0ctxM3+aJ7UWM4RZys9FzdB4Ea+pmSrduXHAxbaLvu2NYnyemShViNa2bX1lSuqU9lgpZDcMRmmHcmumYMbRrdMtkhP37J0IFqyTtX5p46QryLPkIUinXXbl2B8t89vYrWmq1iwnDe6//3KBC8o/ksSslsIcqhcCMsuGEiw54IKXetGuYkiorwwILgULCdTzqg2nQxuT0Z2TctiWD3uujeYO0QTiVQ1VM1DdhHRMYYOJ7yxRO9HOhpqVpkOQON4xzzUZuoBkDUO5OgeMDzgKEFfsbBlyexNAfNZfJYBHl2Ns2oZyNsi+xuqPXq2lF8+rLUBBqGMZDQ6PCW2VUaVUa6jNIonewjpKn9WWrn/hdUv+8dywPolsoPy67h0tdp0HRVYLbME0msxhxHBG1hYxjN5ZpXeJt/xClb/kmwpK5S5V2eh4/lXXso3r0yxqbhAZ0x4SA8fGWZLSJuaY1BiBjH8Z/8IYwzdM9ebiH7I73/6wr0Ynep4KwC8nuj5YCErCrHSchmvoaV37QC8MxRX9xsXSDKeyHu3uQsswjxHy+fajiOjk4dqN0evS+KhWz1Bwlbs5UyzyeXQFcAd5ROcdHGSlEj4EkDFL50UX6XuhlAkzigdcxwey60YI8mce3diJyW9/rus1q7lCcok8U4xetreV28ZZZIFW/CFOU5CfXN0ne2ZqyapCqe28dYuqAMQBTFRp9wBIZr+ngfSAeZyTtSjz
Sg6runKm0nBpRv655iG60IO/bMbVntk+2pZ+6hIxpaHRfMhXvtE17/n7NuGV6DaC7T8XRxZA3ja05LcrmXIvOSbBrQf/1N1H9ekQhvtTVJZXKeSrwzjF4wk7I7JblcPubh+JfTOP0wrO/m+PbFkN33pUUfx4OzXY/CLsl/zTH7qXG4aefMEgG582gZEfrxNtrzP95G932XMHg2bQvH+gwBYFmX2VVQhiFied9R4v0JVg7/jBkRGyY59cS76cskU+j4Y98x6o+lOQ6wzLHvNZpbd3/HplrYoucjb7cHI1O3N4/mOy4sBAGIWPvRuRQen0ur4Brg3hh6nq8nvWGuL1x91iFukUW1/v1D0b3+FYl5+y8= \ No newline at end of file diff --git a/videoroom/assets/diagrams/server_scheme.drawio b/videoroom/assets/diagrams/server_scheme.drawio deleted file mode 100644 index a3cf766..0000000 --- a/videoroom/assets/diagrams/server_scheme.drawio +++ /dev/null @@ -1 +0,0 @@ -7Ztdc6IwFIZ/jZftQEIQL1dtuxfbrbOdnd1eIkRhisYJsdr99RskUSBYrR8EK70pOUnAvOfxkJPEFuxNlg/UnQWPxMdRCxj+sgX7LQBMC9j8X2J5Ty0OQKlhTENfNNoYnsN/WBgNYZ2HPo5zDRkhEQtneaNHplPssZzNpZQs8s1GJMo/deaOsWJ49txItf4JfRbIUbQ39u84HAfyyabdSWsmrmwsRhIHrk8WGRO8a8EeJYSlV5NlD0eJeFKXtN/9ltr1B6N4yvbpMBig2Zh0O0bgoR/L19D+Oby7EXd5c6O5GPCvp6dHbhEVMXuXSiyCkOHnmesl5QX3dgt2AzaJeMnkl248S/UfhUvMH9kdhVHUIxGhq+5wNBoBz+P2mFHyijM1vj20kc1r1BHJj4cpw8uMSYzwAZMJZvSdNxG1llRb4GbaorzIOE+YgozfpM0VuIzXd94oyi+EqJ8QGGwTGFyowDAvsNXWLDBUBH6+/10KML8Zjxq80D1SaYxMyzASO5myvAf4X5kHusAwVj1O4AFk1wxxa4sHVMK/igfaNfsOIMUDA4xpL3D5OzEq/S5QMp/6ibJ9Y7c3KGEuC8mUF6GhOsNH2PGtMtEdMIT2iQIPsArYG6roJihR3TqX6vYO1VX+D1Vdn+jF16mjW/T2DtHhyUS/MZE22QsvWYB0y+5cYYSxtEeYzvVFGEt7hJGZ53WFGFQyi6xYd3UmrwiNp/63JKnnJS9y4zhM8x2XMtWckzzrH7wM2V9Rk1y/JPZbJEr9ZaZZ/10WpnyIf7OFl2xh02lVkr22+ikmc+rhD8QQ8ZaPbIzZB+2ERNjPrWKoXs94FZU4Vdoojjibb/m1jzJPiycMSMhHtoYKFqCCnQIs6bhFr+xaRfFGTuFGRepSYZQbrcBbD/sIFtWcpvYs3lr8G/Qxj6vSANOQ64Pp0ZB2rhvS9fxAhlCjYkjVtK/+kLbR2WKmfHFfK4/ag6aaEJ+Jxy1s7ULrEI4PxxHsSaPV0HgWGtWVgmumUe6d7aIR1ovGwlo3bB9IIyqkmeu0syoa1QWUamnc+d6tFsf2njia9cIRnAhHWNhBgFXjqK4sXXNwtBsaddIoH6eNRhPs5LEsc64WUtRAqhVS9ahMTLxXnHzEI3c/igc5HA+XH+QYOshCJ9rE7uTl1L78C9TVjLW+ajiov76F9Ef/zilQ0/O1wEduaWgRuBAPQMkhjIoFVjPOi44QxQRev8BqEnXRIaJ4jEh/iFDTgq8UIpB2gaHumW698i6w76qUmBs0c9rTzmmhOqe9ahz3TbHsBsez4AgaHEsm7DtxbDc4ngXHyg4kSRzNw3A0KsFx7yNJNcPxq2xowsrOJF0EjnsfPnIaHM
+CY2Wnjy4Cx/0PH3UaHj/HIy9ufiWbNt/81hje/Qc= \ No newline at end of file diff --git a/videoroom/assets/diagrams/total_scheme.drawio b/videoroom/assets/diagrams/total_scheme.drawio deleted file mode 100644 index 5444338..0000000 --- a/videoroom/assets/diagrams/total_scheme.drawio +++ /dev/null @@ -1 +0,0 @@ -5Vtbc5s4FP41ntl9SAZJiMtj7KTbnWmnmSadNo8yyDYtRh5QEqe/fgWIixA22AYnzbozDTogAed856rDBM3W239isll9Zj4NJ9DwtxN0PYEQAtsRf1LKS05xIcgJyzjwc1KNcBf8ppJoSOpj4NNEuZAzFvJgoxI9FkXU4wqNxDF7Vi9bsFC964YsqUa480ioU78HPl/lVAfaFf0jDZar4s7AcvMza1JcLN8kWRGfPddI6GaCZjFjPD9ab2c0TJlX8CWf92HH2fLBYhrxPhNm05+fbv3pF+rdf/k8fwivfnPvAuarPJHwUb7wHY2faCwfmb8UfEieg3VIIjGaLoIwnLGQxdkZ5BPqLDxBT3jMftHaGctz6Hwhzsib0JjT7c6nByVPBJgoW1Mev4hLniuuI2Dm01Y1juOCv0RKelnOrZghDiQ/DuCNqfFG4wqN/KsUZGIUsYw5K74WN7kGJT8K1CBBidlj5NP0nkZ2nsS8MT2/A/U1UFZ82i3HvczDRgvvJC2mIeHBk3rPNn7KO9yyQDwNNKROm6AhgoQ9xh6VV9UhWUzcFgKVLN21kODQknJtoUyc5WseL2Ggi/grY2tNzKrcnlcBp3cb4qVnn4XZU8XeUA+KgWmk0xYs4jX6Ivu1qc0UGkY2o5faHAIHuYpjKFwHrhzX0AJgC1zcsTQNOJocvBURBj0cVBQ+po5vtrHcgXNkWSOyvFQ1yXOMNJZbLcbNGYvjlm73P3wThJtoGQhDNCTX58D3F0Yb14FhI5eeD+jI7gn00dhehA2H+RSfJKuM+UBldEq/JZzTOMoo0ECa24Ga2znJxYjoKzPJXdA6uysqcGM0PIrpXrq1n3OcozKbjmr/siO7LfOwyMQLSZIEXjPiqMg1SKlQoduA/0iPL7EcPdTOXG/rg5diEImX+1EfPFQrpMNqWjZ6GQaWuSR7+PpO/ObO6NUA3HAUwEHHQVYEEepCqLHQ2CDF5wJpCbgKYw/yyn2AOwbcY4PU6otR8zUxWqZcElrQPDL+x/aOSPRcGNXDoCSIliQMxf+poJ+EWJP0YJvGo0s9MEqh8InMRaCqIJSEwTL1yZ6YL5JpNE1jm8Aj4ZU8sQ58P11jGtMk+E3m2Xopwjbpu2Zvj6cTfL0vOJIlDzl5UhYaOtGYq+bOqMm4NICBVcEMgpsLV1n0AriWugRbLBI6jqT1wOv/IOmOpAQ04mOYRjW2ZdoGdhyIzN0CPikKdrs9Q7Iim/RwEdKt9AXTQbzF4SnhMiZ+QJXc3bV9w7b7+YPushYwWkx1STxR51CRWxeRq32krbbMjoXGrtUYGmq+Xd9OoFjN+Hp/m4qBe5fvQ23dDvuMjSKRLEwpGsZAO6qML2AjZhzPQAPdKCTM+yVvVpcojwNhoDPZDF7+6db144rYpTKaCn+xNHz1KtBZyxG6Tp3MdD+IBeYDlirRM034W5QDtq0uOdjnrMYBoMlhFqbvr/vFndswC8ejXus2zNzBJu5dT97POQvrJTSE2lKN0bZloNNq5O/kkMV8xZYsIuFNRW0EAdU1nxjbSOD+pJy/yA1I8siZCusqW4TH1ELkrL65aY+4Yi+Quqshfat5ALXjobenOUnWSDdQQfQk4J2G6sJbh3Pi/Ureh9OHO+pOhW8WGAJqAgWHcfpYNYZFeD6+z4f4dTX5KEV+Y3qM+urxjrL8mfRYr1g/b
nzC6XvR3c6KijVQCQU2kvIRtVOPyFOL2y6xs+msXVda0FdpbUVrjVG1dh8KbKthwfEgqIC2mg5Cs2HDe28aNBZCdmOhHUn+VRyTl9plUuf6P7DrHvRcjevFQf4EjdlnSFuRpiTvo4OkA8vAUQXSu4fEGi2H0mv53/7VbVUYBpukRxpLkk3eTrgItqnctHQLp/90oeS/NqFY2W9UobjupVq3R7YmFRfrQilow+dqemL7xzX27GL6W6nhDNumaGruemhv+NZ7EpG6KXL2nkSoyfMzXc9jEtHvdP71fjao5ozanHWgD0FOTx8ymi6hw9qzDmlOKDPFvBsGdrXDlBFv7xz1ND0dMtfssec1YneCeM5GewI60hRcNBtlm8Ab2RQgvS1285issuJXuWEtUiUhu/am/TPk0jHjRFb7TTxKbp0r5d66mJpUDZRpA2XRi+YK4+UUqMf++IFWSBoTUDcmVSp9WCOfPVYnX29DVH4IdGzxukrIsYsbYrY0H/SubdU5k2VTr+YnAsiKOVvEafb8rg1aV1gEBC4NCGwDpr04EOGmYMVpZGHDdC0kwjRrN+hO+/ypxeq8VnV+f6Fv/MgH9jU4sF2056myQ70YlQj8l01uf93PbsXCH+/vb//u51EO7a7qIYnOGqkJ1f0ooGcFI1pkVGz8l9+rHmePccMeN9cZqJC64zYj1kXFsPpmNb+8+vIX3fwH \ No newline at end of file diff --git a/videoroom/assets/images/Illo_videoroom.png b/videoroom/assets/images/Illo_videoroom.png deleted file mode 100644 index e4c4134..0000000 Binary files a/videoroom/assets/images/Illo_videoroom.png and /dev/null differ diff --git a/videoroom/assets/images/SFU_scheme.png b/videoroom/assets/images/SFU_scheme.png deleted file mode 100644 index 0aea2b0..0000000 Binary files a/videoroom/assets/images/SFU_scheme.png and /dev/null differ diff --git a/videoroom/assets/images/client_flow1.png b/videoroom/assets/images/client_flow1.png deleted file mode 100644 index 2247fce..0000000 Binary files a/videoroom/assets/images/client_flow1.png and /dev/null differ diff --git a/videoroom/assets/images/client_flow2.png b/videoroom/assets/images/client_flow2.png deleted file mode 100644 index de2f106..0000000 Binary files a/videoroom/assets/images/client_flow2.png and /dev/null differ diff --git a/videoroom/assets/images/modular_rtc.png b/videoroom/assets/images/modular_rtc.png deleted file mode 100644 index 7b59b77..0000000 Binary files a/videoroom/assets/images/modular_rtc.png and /dev/null differ diff --git a/videoroom/assets/images/server_scheme.png b/videoroom/assets/images/server_scheme.png deleted file mode 100644 index 122254a..0000000 
Binary files a/videoroom/assets/images/server_scheme.png and /dev/null differ diff --git a/videoroom/assets/images/total_scheme.png b/videoroom/assets/images/total_scheme.png deleted file mode 100644 index e7db9f5..0000000 Binary files a/videoroom/assets/images/total_scheme.png and /dev/null differ diff --git a/videoroom/assets/images/tutorial_graphic.svg b/videoroom/assets/images/tutorial_graphic.svg deleted file mode 100644 index 8d6bb3b..0000000 --- a/videoroom/assets/images/tutorial_graphic.svg +++ /dev/null @@ -1,9 +0,0 @@ - - - - - - - - - diff --git a/videoroom/assets/images/vad_flow_scheme.png b/videoroom/assets/images/vad_flow_scheme.png deleted file mode 100644 index 7441ce3..0000000 Binary files a/videoroom/assets/images/vad_flow_scheme.png and /dev/null differ diff --git a/videoroom/assets/records/expected_result.webp b/videoroom/assets/records/expected_result.webp deleted file mode 100644 index 5bb1158..0000000 Binary files a/videoroom/assets/records/expected_result.webp and /dev/null differ diff --git a/videoroom/assets/records/mute.webp b/videoroom/assets/records/mute.webp deleted file mode 100644 index 37a6a8f..0000000 Binary files a/videoroom/assets/records/mute.webp and /dev/null differ diff --git a/videoroom/assets/records/vad.webp b/videoroom/assets/records/vad.webp deleted file mode 100644 index e8e5551..0000000 Binary files a/videoroom/assets/records/vad.webp and /dev/null differ diff --git a/videoroom/index.md b/videoroom/index.md deleted file mode 100644 index 8548d78..0000000 --- a/videoroom/index.md +++ /dev/null @@ -1,16 +0,0 @@ ---- -title: Videoconferencing with Membrane -description: Create your very own videoconferencing room with a little help from the Membrane! 
-part: 5 -graphicPath: assets/images/Illo_videoroom.png ---- - -| number | title | file | -| ------ | ------------------------------ | ----------------------------------------- | -| 1 | Introduction | 1_Introduction.md | -| 2 | Environment preparation | 2_EnvironmentPreparation.md | -| 3 | System architecture | 3_SystemArchitecture.md | -| 4 | Server's communication channel | 4_CreatingServersCommunicationChannels.md | -| 5 | Server's room | 5_ImplementingServerRoom.md | -| 6 | Client's application | 6_ImplementingClientsApplication.md | -| 7 | Further steps | 7_FurtherSteps.md |