The NX Integration SDK provides tools and examples for clients to create applications that integrate seamlessly with the NX software stack.
See the subdirectories for further documentation.
You probably already have the integration SDK if you're looking at this readme. The command to get the full integration SDK is as follows:

git clone https://github.com/scailable/sclbl-integration-sdk.git --recurse-submodules

If you have downloaded the SDK previously, you can also update to the latest version of the integration SDK by running the following while in the directory of the downloaded git repository:

git pull --recurse-submodules

Note that the repository should be cloned with --recurse-submodules so that the submodules are included.
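If you did clone without --recurse-submodules, the submodules can still be fetched afterwards with git submodule update. The sketch below demonstrates the command in a throwaway repository; in practice you would run just the last git command inside your sclbl-integration-sdk checkout:

```shell
# Demo in a fresh, empty repository (no submodules here, so this is a
# no-op, but the same command populates missing submodules in a real
# checkout that was cloned without --recurse-submodules)
cd "$(mktemp -d)"
git init -q .
git submodule update --init --recursive && echo "submodules initialized"
```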
These applications can be compiled on any architecture natively in a Linux environment.
To compile, some software packages are required. These can be installed by running:
sudo apt install cmake
sudo apt install g++

For the Python pre/postprocessors, the following packages are also required:
sudo apt install python3-pip
sudo apt install python3-venv
sudo apt install patchelf

These applications can be run on any platform on which they can be compiled.
This project is CMake based, and all its modules can be compiled or gathered with CMake commands.
Because the different pre/postprocessors must be compiled for each hardware architecture this repository does not include pre-built binaries. All processors can be compiled manually.
Change into the directory created for the project if you're not already there.
cd nxai-integration-sdk/

Prepare the build directory in the project directory, and switch to the build directory:
mkdir -p build
cd build

Set up a Python virtual environment in the newly created build directory (this is required on recent Ubuntu servers):
python3 -m venv integrationsdk
source integrationsdk/bin/activate

Set up the CMake configuration:
cmake ..

Build a single processor:

cmake --build . --target <processor-name>
Build all targets:
cmake --build .

This will build the default target, which includes all the example applications that are active in the CMakeLists.txt.
It is possible to only run specific examples, refer to the readme files in the subdirectories of those examples for specific instructions.
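If you are unsure whether the virtual environment from the steps above is active, a quick self-contained check is possible. The sketch below uses a throwaway directory rather than the SDK's build tree, and passes --without-pip only to keep the demo lightweight (omit that flag for real use):

```shell
# Create and activate a demo virtual environment in a temporary directory
tmp=$(mktemp -d)
python3 -m venv --without-pip "$tmp/integrationsdk"
. "$tmp/integrationsdk/bin/activate"
# While a venv is active, sys.prefix points inside the venv directory
python -c 'import sys; print(sys.prefix)'
deactivate
```

If the printed path ends in integrationsdk, the environment is active; a system prefix such as /usr indicates it is not.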
Before installing, make sure the target directories are writable:
sudo chmod 777 /opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/postprocessors/
sudo chmod 777 /opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/preprocessors/

To install a single processor to the default folder:
cmake --install . --component <processor-name>
To install the generated pre/postprocessor examples to the default pre/postprocessors folder:
cmake --build . --target install

Create a configuration file at /opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/postprocessors/external_postprocessors.json or /opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/preprocessors/external_preprocessors.json and add the details of your pre/postprocessors to the root object of that file. For example, the following files enable all Python-based pre/postprocessors:
{
"externalPostprocessors": [
{
"Name":"EI-Upload-Postprocessor",
"Command":"/opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/postprocessors/postprocessor-python-edgeimpulse-example",
"SocketPath":"/tmp/python-edgeimpulse-postprocessor.sock",
"ReceiveInputTensor": true,
"RunLast": false,
"NoResponse": true
},
{
"Name":"Example-Postprocessor",
"Command":"/opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/postprocessors/postprocessor-python-example",
"SocketPath":"/tmp/python-example-postprocessor.sock",
"ReceiveInputTensor": false
},
{
"Name":"Image-Postprocessor",
"Command":"/opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/postprocessors/postprocessor-python-image-example",
"SocketPath":"/tmp/python-image-postprocessor.sock",
"ReceiveInputTensor": true
},
{
"Name":"NoResponse-Postprocessor",
"Command":"/opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/postprocessors/postprocessor-python-noresponse-example",
"SocketPath":"/tmp/python-noresponse-postprocessor.sock",
"ReceiveInputTensor": false,
"ReceiveBinaryData": false,
"NoResponse": true
},
{
"Name":"Cloud-Inference-Postprocessor",
"Command":"/opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/postprocessors/postprocessor-cloud-inference-example",
"SocketPath":"/tmp/python-cloud-inference-postprocessor.sock",
"ReceiveInputTensor": true
}
]
}

The corresponding external_preprocessors.json:

{
"externalPreprocessors": [
{
"Name":"Example-Preprocessor",
"Command":"/opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/preprocessors/preprocessor-python-example",
"SocketPath":"/tmp/example-preprocessor.sock"
}
]
}

Finally, to (re)load your new pre/postprocessor, make sure to restart the NX Server with:
sudo service networkoptix-metavms-mediaserver restart

You should also make sure the pre/postprocessor can be executed by the NX AI Manager (similar to the chmod commands used earlier):
sudo chmod -R a+x /opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/postprocessors/
sudo chmod -R a+x /opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/preprocessors/

If the pre/postprocessor is defined correctly, its name should appear in the list of pre/postprocessors in the NX Cloud Pipelines UI. If it is selected in the pipeline settings, the NX AI Manager will send data to the pre/postprocessor and wait for its output.
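If the processor does not show up, one easy thing to rule out is a syntax error in the configuration file. Assuming python3 is available on the server, its bundled json.tool module can check the file. The sketch below validates a throwaway copy; point it at the real external_postprocessors.json in practice:

```shell
# Write a minimal config to a temporary location for demonstration;
# in practice, run json.tool on the real file under nxai_manager/
tmp=$(mktemp -d)
cat > "$tmp/external_postprocessors.json" <<'EOF'
{
  "externalPostprocessors": [
    {
      "Name": "Example-Postprocessor",
      "Command": "/opt/networkoptix-metavms/mediaserver/var/nx_ai_manager/nxai_manager/postprocessors/postprocessor-python-example",
      "SocketPath": "/tmp/python-example-postprocessor.sock",
      "ReceiveInputTensor": false
    }
  ]
}
EOF
# json.tool exits non-zero and reports the error location on invalid JSON
python3 -m json.tool "$tmp/external_postprocessors.json" > /dev/null && echo "config is valid JSON"
```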
For fast iterative development, it is often desirable to reload the postprocessor without a full recompile.
Functionality has therefore been added to instruct the AI Manager not to start the external postprocessor, but instead assume it's running. The AI Manager will then attempt to send data to the external processor each frame, and wait for a response. You can then start the processor yourself to immediately see its effect. You can also print/manipulate data easily to get familiar with the format of the data that the processor receives.
Once you are satisfied with how your processor behaves, you can then compile and package it.
To instruct the AI Manager to not start the external processor, simply omit the "Command" field in the processor definition.
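For example, a development-mode variant of the Example-Postprocessor entry shown earlier would simply drop the "Command" field; all other fields keep their meaning:

```json
{
    "externalPostprocessors": [
        {
            "Name": "Example-Postprocessor",
            "SocketPath": "/tmp/python-example-postprocessor.sock",
            "ReceiveInputTensor": false
        }
    ]
}
```

With this entry, the AI Manager only connects to the socket, and you start the processor yourself from a terminal.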
Copyright 2025, Network Optix, All rights reserved.