# Installing and Using Plugins

**NOTE: this page describes what a Jami plugin is and how to install and use one.**

As of September 2020, the Jami team has added plugins as a call and chat feature for Linux, Windows, and Android users.
This means that you can now personalize your call/chat experience by using one of our available plugins.
But that is not all: you can also turn your awesome ideas into a brand new plugin!

* To properly set up a plugin, follow the steps in [#How to use it?](#how-to-use-it).
* To build an available plugin, please refer to the [#How to build?](#how-to-build) instructions.
* To create your own plugin, please refer to the [Create Plugin](7.1 - Create Plugin) instructions.

## How does it work?
Jami can be broken down into three main components that interact with each other: the daemon, LRC, and the clients.
The daemon is the core of Jami; although it does not interact with users, it is involved in every
command. The daemon therefore has a `JamiPluginManager` class that, among other actions, performs install/uninstall, load/unload, edits preferences, and controls plugins' usage.
Despite the daemon's importance, what a plugin actually does to a call's video/audio or to a chat message is unknown to it, just as the daemon does not know what is actually done by LRC or the client interfaces.
Plugins can therefore be seen as a fourth interacting component in Jami.

The plugin system inside Jami exposes different APIs that can be used by plugins, for instance the ChatHandler and the MediaHandler APIs. The latter enables plugins to modify audio and video streams from Jami calls and is used by our GreenScreen plugin, but it could also be used to build a YouTube streaming system, various Instagram-style filters, a real-time translation service, etc.

Plugins can be composed of one or multiple media and chat handlers, which are responsible for attaching a data stream from Jami to a data process and detaching it again. Each handler represents a functionality; handlers can be completely different from one another, or be modified versions of the same core process. In our example there is only one functionality: the GreenScreen plugin has a single media handler whose data process segments the foreground of a video frame and applies another image to the background, just like the green screens used in movies!

To use a custom functionality, Jami must know all of the plugins' handlers, which one is going to be used, and the data that should be processed. A plugin's handlers are created once the plugin is loaded and are shared with the daemon's plugin manager. The data is inside the Jami flow (for a call plugin, in the event of a new call, Jami creates and stores the corresponding media stream subjects). Finally, once a user puts a plugin functionality into action, Jami tells the corresponding handler to attach the available data. When the functionality is deactivated, Jami tells the handler to detach it.

## How to use it?

### Setup
A Jami plugin is a `pluginname.jpl` file, and it must be installed into your Jami client.
Once installed, Jami will add your new plugin to the list of available plugins, but it will not be ready for use yet. Plugins are libraries and must be loaded if you want to expose them.
Moreover, a plugin may have preferences, and besides the install/uninstall and load/unload actions it is possible to modify those preferences. For example, our GreenScreen plugin allows the user to change the background image displayed.
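
If you are curious about what such a bundle contains, you can peek inside it before installing; this is a minimal sketch, assuming the `.jpl` file is a zip-compatible archive and using a placeholder file name:

```bash
# List the contents of a plugin bundle before installing it.
# 'pluginname.jpl' is a placeholder; the archive is assumed to be zip-compatible.
unzip -l pluginname.jpl
```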

#### Android
To set up a plugin for Android, go to Settings, enable plugins if they are disabled, and select a plugin file from your phone.
Once installed, the plugin is automatically loaded.
Optionally, you can manually load/unload it using the checkbox button in the plugin list.

![androidInstall](androidInstall.gif)

To uninstall on Android, click on the plugin; an uninstall option will appear along with the preferences and a reset preferences option.
For a preference modification to take effect, the plugin has to be reloaded.

![androidPrefs](androidPrefs.gif)

#### Linux/Windows
Similarly, for the client-qt available on Linux and Windows and for the client-gnome available only on Linux, go to Preferences, enable plugins if they are disabled, and select a plugin file from your computer.
Each plugin in the list is linked to two buttons:

- Client-qt: a load/unload button and a preferences button;
- Client-gnome: a load/unload button and an uninstall button.

For client-gnome it is not possible to change a plugin's preferences.

![clientqtInstall](clientqtInstall.gif)
![clientqtPrefs](clientqtPrefs.gif)

### Use!
A media handler functionality only takes effect if you turn it on during a call.
On Android, Linux, or Windows you can do so by clicking the plugins icon on your call screen.

![androidCall](androidCall.gif)
![clienqtCall](clienqtCall.gif)

Similarly, for a chat handler functionality, you will see a plugin icon in the chat window, as in the images below.

## How to build?
If you want to make something with your video call, it is likely that you will do so with OpenCV and/or deep learning models (TensorFlow, PyTorch, etc.).
So, before building the plugin itself, you need to build the plugin's dependencies.

### Dependencies
Here we give you the steps to build [OpenCV](https://opencv.org/) and [ONNX](https://www.onnxruntime.ai/), but do not feel limited to these libraries. We offer a [page](7.2 - Tensorflow Plugin) with a detailed explanation of how to build the TensorFlow C++ API for Windows, Linux, and Android.
Other libraries should work as long as they and the plugin are correctly built!

#### OpenCV 4.1.1
We kindly added OpenCV 4.1.1 as a contrib in the [daemon](https://git.jami.net/savoirfairelinux/ring-daemon/tree/master/contrib).
This way you can easily build OpenCV for Android, Linux, and Windows. You only have to follow the corresponding instructions.

##### Windows
```bash
set DAEMON=<path/to/daemon>
cd %DAEMON%/compat/msvc
python3 winmake.py -fb opencv
```

##### Linux
With Docker (recommended):
```bash
export DAEMON=<path/to/daemon>
cd ${DAEMON}/../
docker build -f plugins/docker/Dockerfile_ubuntu_18.04_onnxruntime -t plugins-linux .
docker run --rm -it -v ${DAEMON}/../:/home/plugins/jami:rw plugins-linux:latest /bin/bash
cd jami/plugins/contrib
cd ../../daemon/contrib
mkdir native
cd native
../bootstrap --disable-argon2 --disable-asio --disable-fmt --disable-gcrypt --disable-gmp --disable-gnutls --disable-gpg-error --disable-gsm --disable-http_parser --disable-iconv --disable-jack --disable-jsoncpp --disable-libarchive --disable-libressl --disable-msgpack --disable-natpmp --disable-nettle --enable-opencv --disable-opendht --disable-pjproject --disable-portaudio --disable-restinio --disable-secp256k1 --disable-speexdsp --disable-upnp --disable-uuid --disable-yaml-cpp --disable-zlib
make list
make fetch opencv opencv_contrib
make
```

Using your own system:
```bash
export DAEMON=<path/to/daemon>
cd ${DAEMON}/contrib/native
../bootstrap --enable-ffmpeg --disable-argon2 --disable-asio --disable-fmt --disable-gcrypt --disable-gmp --disable-gnutls --disable-gpg-error --disable-gsm --disable-http_parser --disable-iconv --disable-jack --disable-jsoncpp --disable-libarchive --disable-libressl --disable-msgpack --disable-natpmp --disable-nettle --enable-opencv --disable-opendht --disable-pjproject --disable-portaudio --disable-restinio --disable-secp256k1 --disable-speexdsp --disable-upnp --disable-uuid --disable-yaml-cpp --disable-zlib
make list
make fetch opencv opencv_contrib
make
```
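
As an optional sanity check, you can verify that the contrib build produced the OpenCV libraries. This is a sketch that assumes the usual contrib layout, where artifacts land under a host-triplet directory inside `contrib/`; adjust the path if your tree differs:

```bash
# Look for the OpenCV libraries produced by the contrib build.
# The exact host-triplet directory name depends on your system.
ls ${DAEMON}/contrib/*/lib | grep -i opencv
```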

##### Android
Using Docker (recommended):
```bash
export DAEMON=<path/to/daemon>
cd ${DAEMON}/../
docker build -f plugins/docker/Dockerfile_android_onnxruntime -t plugins-android .
docker run --rm -it -v ${DAEMON}/:/home/gradle/src:rw plugins-android:latest /bin/bash
cd plugins/contrib
ANDROID_ABI="arm64-v8a" sh build-dependencies.sh
```

Using your own system:
```bash
export DAEMON=<path/to/daemon>
cd ${DAEMON}
export ANDROID_NDK=<NDK>
export ANDROID_ABI=arm64-v8a
export ANDROID_API=29
export TOOLCHAIN=$ANDROID_NDK/toolchains/llvm/prebuilt/linux-x86_64
export TARGET=aarch64-linux-android
export CC=$TOOLCHAIN/bin/$TARGET$ANDROID_API-clang
export CXX=$TOOLCHAIN/bin/$TARGET$ANDROID_API-clang++
export AR=$TOOLCHAIN/bin/$TARGET-ar
export LD=$TOOLCHAIN/bin/$TARGET-ld
export RANLIB=$TOOLCHAIN/bin/$TARGET-ranlib
export STRIP=$TOOLCHAIN/bin/$TARGET-strip
export PATH=$PATH:$TOOLCHAIN/bin
cd contrib
mkdir native-${TARGET}
cd native-${TARGET}
../bootstrap --build=x86_64-pc-linux-gnu --host=$TARGET$ANDROID_API --enable-opencv --enable-opencv_contrib
make
```

#### Onnxruntime 1.6.0
A common difficulty for people working with deep learning models is how to deploy them.
With that in mind, we give users the possibility of using onnxruntime.
There are several development libraries to train and test models, but they are usually too heavy to deploy. TensorFlow with CUDA support, for instance, can easily surpass 400 MB. For our GreenScreen plugin we chose onnxruntime because it is lighter (about 140 MB with CUDA support) and supports model conversion from several development libraries (TensorFlow, PyTorch, Caffe, etc.).

* For more advanced and curious third-party developers, we also [provide instructions](7.2 - Tensorflow Plugin) to build the TensorFlow C++ API for Windows and Linux, and the TensorFlow Lite C++ API for Android.

To build onnxruntime-based plugins for Linux and Android, we strongly recommend using the Docker files available under `<plugins>/docker/`. We do not offer a Windows Dockerfile, but below we carefully guide you through the proper build of this library for our three supported platforms.
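
For reference, the Docker image build and run commands are the same ones shown in the OpenCV section above; on Linux, for example:

```bash
# Same Dockerfile and mount point as in the OpenCV Linux build above.
export DAEMON=<path/to/daemon>
cd ${DAEMON}/../
docker build -f plugins/docker/Dockerfile_ubuntu_18.04_onnxruntime -t plugins-linux .
docker run --rm -it -v ${DAEMON}/../:/home/plugins/jami:rw plugins-linux:latest /bin/bash
```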

If you want to build onnxruntime with Nvidia GPU support, make sure you have a CUDA-capable GPU, that you have followed all installation steps for the Nvidia drivers, the CUDA Toolkit and cuDNN, and that their versions match.

The following links may be very helpful:

* https://developer.nvidia.com/cuda-gpus
* https://developer.nvidia.com/cuda-toolkit-archive
* https://developer.nvidia.com/cudnn
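
Before starting a CUDA build, it can save time to confirm that the driver, toolkit, and cuDNN are all visible and that their versions match. A quick check on Linux (a sketch; header locations vary between cuDNN releases):

```bash
# Driver version and the highest CUDA version it supports.
nvidia-smi
# CUDA Toolkit version; it should match the version you build against.
nvcc --version
# cuDNN version macros live in cudnn_version.h on cuDNN 8+, in cudnn.h on older releases.
grep -R "CUDNN_MAJOR" /usr/include/cudnn*.h /usr/include/x86_64-linux-gnu/cudnn*.h 2>/dev/null
```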

#### Linux and Android
We kindly added onnxruntime as a contrib in the [daemon](https://git.jami.net/savoirfairelinux/ring-daemon/tree/master/contrib).
This way you can easily build onnxruntime for Android and Linux.

* Linux - Without acceleration:

```bash
export DAEMON=<path/to/daemon>
cd ${DAEMON}/contrib/native
../bootstrap
make .onnx
```

* Linux - With CUDA acceleration (CUDA 10.2):

```bash
export CUDA_PATH=/usr/local/cuda/
export CUDA_HOME=${CUDA_PATH}
export CUDNN_PATH=/usr/lib/x86_64-linux-gnu/
export CUDNN_HOME=${CUDNN_PATH}
export CUDA_VERSION=10.2
export USE_NVIDIA=True
export DAEMON=<path/to/daemon>
cd ${DAEMON}/contrib/native
../bootstrap
make .onnx
```

* Android - With NNAPI acceleration:

```bash
export DAEMON=<path/to/daemon>
cd ${DAEMON}
export ANDROID_NDK=<NDK>
export ANDROID_ABI=arm64-v8a
export ANDROID_API=29
export TOOLCHAIN=$ANDROID_NDK/toolchains/llvm/prebuilt/linux-x86_64
export TARGET=aarch64-linux-android
export CC=$TOOLCHAIN/bin/$TARGET$ANDROID_API-clang
export CXX=$TOOLCHAIN/bin/$TARGET$ANDROID_API-clang++
export AR=$TOOLCHAIN/bin/$TARGET-ar
export LD=$TOOLCHAIN/bin/$TARGET-ld
export RANLIB=$TOOLCHAIN/bin/$TARGET-ranlib
export STRIP=$TOOLCHAIN/bin/$TARGET-strip
export PATH=$PATH:$TOOLCHAIN/bin
cd contrib
mkdir native-${TARGET}
cd native-${TARGET}
../bootstrap --build=x86_64-pc-linux-gnu --host=$TARGET$ANDROID_API
make .onnx
```

#### Windows

* Pre-build:

```bash
mkdir pluginsEnv
export PLUGIN_ENV=<full-path/pluginsEnv>
cd pluginsEnv
mkdir onnxruntime
mkdir onnxruntime/cpu
mkdir onnxruntime/nvidia-gpu
mkdir onnxruntime/include
git clone https://github.com/microsoft/onnxruntime.git onnx
cd onnx
git checkout v1.6.0 && git checkout -b v1.6.0
```

* Without acceleration:

```bash
.\build.bat --config Release --build_shared_lib --parallel --cmake_generator "Visual Studio 16 2019"
cp ./build/Windows/Release/Release/onnxruntime.dll ../onnxruntime/cpu/onnxruntime.dll
```

* With CUDA acceleration (CUDA 10.2):

```bash
.\build.bat --config Release --build_shared_lib --parallel --cmake_generator "Visual Studio 16 2019" --use_cuda --cudnn_home <cudnn home path> --cuda_home <cuda home path> --cuda_version 10.2
cp ./build/Windows/Release/Release/onnxruntime.dll ../onnxruntime/nvidia-gpu/onnxruntime.dll
```

* Post-build:

```bash
cp -r ./include/onnxruntime/core/ ../onnxruntime/include/
```
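
After these steps, the plugins environment should look roughly like the sketch below; only the directories for the variants you actually built will be populated:

```bash
# Expected layout of the plugins environment created in the pre-build step:
#   pluginsEnv/onnx/                                   onnxruntime sources (v1.6.0)
#   pluginsEnv/onnxruntime/include/core/               headers copied in the post-build step
#   pluginsEnv/onnxruntime/cpu/onnxruntime.dll         CPU-only build
#   pluginsEnv/onnxruntime/nvidia-gpu/onnxruntime.dll  CUDA build
ls -R ${PLUGIN_ENV}/onnxruntime
```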

For further build instructions, please refer to the official onnxruntime [build documentation](https://github.com/microsoft/onnxruntime/blob/master/BUILD.md).

### Plugin
To exemplify a plugin build, we will use the GreenScreen plugin available [here](https://git.jami.net/savoirfairelinux/jami-plugins).

#### Linux/Android
First you need to go to the plugins repository in your cloned ring-project:

```bash
cd <ring-project>/plugins
```

- Linux - Nvidia GPU

`PROCESSOR=NVIDIA python3 build-plugin.py --projects=GreenScreen`

- Linux - CPU

`python3 build-plugin.py --projects=GreenScreen`

- Android

```bash
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-amd64/jre
export ANDROID_HOME=/home/${USER}/Android/Sdk
export ANDROID_SDK=${ANDROID_HOME}
export ANDROID_NDK=${ANDROID_HOME}/ndk/21.1.6352462
export ANDROID_NDK_ROOT=${ANDROID_NDK}
export PATH=${PATH}:${ANDROID_HOME}/tools:${ANDROID_HOME}/platform-tools:${ANDROID_NDK}:${JAVA_HOME}/bin
ANDROID_ABI="arm64-v8a armeabi-v7a x86_64" python3 build-plugin.py --projects=GreenScreen --distribution=android
```

The GreenScreen.jpl file will be available under `<ring-project>/plugins/build/`.

#### Windows
The Windows build of plugins is linked with the daemon repository and its build scripts. So, to build our example plugin:

```bash
cd <ring-project>/daemon/compat/msvc
python3 winmake.py -fb GreenScreen
```

The GreenScreen.jpl file will be available under `<ring-project>/plugins/build/`.


Related articles:

- https://jami.net/plugins-sdk/