To optimize neural networks, we offer the Neural Network SDK (NN SDK). The NN SDK contains a Training Toolkit for training and an Inference Toolkit for inference. During training, users replace their neurons with augmented neurons using the Training Toolkit. The Training Toolkit is based on TensorFlow, so users follow a training flow similar to the one they would follow otherwise.
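The exact Training Toolkit API is not shown in this document, so the following is only a minimal sketch of the neuron-replacement idea. All names here (`Dense`, `AugmentedDense`, `augment_model`, the `sparsity` knob) are illustrative placeholders, not the toolkit's real API.

```python
# Hypothetical sketch: swapping standard layers for "augmented" ones.
# Dense, AugmentedDense, augment_model, and sparsity are placeholders,
# not the actual NN SDK Training Toolkit API.

class Dense:
    """Stand-in for a standard dense layer."""
    def __init__(self, units):
        self.units = units

class AugmentedDense(Dense):
    """Stand-in for the toolkit's augmented replacement layer."""
    def __init__(self, units, sparsity=0.5):
        super().__init__(units)
        self.sparsity = sparsity  # hypothetical optimization knob

def augment_model(layers, sparsity=0.5):
    """Return a copy of the model with each Dense layer swapped
    for its augmented counterpart, preserving layer widths."""
    return [AugmentedDense(layer.units, sparsity) if type(layer) is Dense else layer
            for layer in layers]

model = [Dense(128), Dense(10)]
augmented = augment_model(model)
print([type(layer).__name__ for layer in augmented])
# → ['AugmentedDense', 'AugmentedDense']
```

In the real flow the model would then be trained as usual in TensorFlow, with the augmented layers exposing the optimization knobs.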
Once the desired trade-off among the optimization targets (latency, throughput, memory, and compute) is achieved, users run production inference with the Inference Toolkit. The Inference Toolkit is optimized for the AVX2 instruction set on x86. The resulting network can be used in conjunction with OpenVINO in the user's application.
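Since the Inference Toolkit targets AVX2 on x86, it can be useful to verify that a deployment host actually exposes AVX2 before running production inference. The sketch below is a Linux-only check via `/proc/cpuinfo`; it is a generic helper and not part of the NN SDK itself.

```python
# Sketch: check whether the host CPU advertises AVX2 support.
# Linux-only (reads /proc/cpuinfo); returns None on other platforms.
# This is a generic helper, not part of the NN SDK.

import platform

def has_avx2():
    """Return True/False on Linux, or None where /proc/cpuinfo is unavailable."""
    if platform.system() != "Linux":
        return None
    try:
        with open("/proc/cpuinfo") as f:
            return "avx2" in f.read()
    except OSError:
        return None

print(has_avx2())
```

On hosts without AVX2, inference would fall back to whatever code paths the toolkit and OpenVINO provide, typically at reduced throughput.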
We also provide reference implementations of selected computer vision models, including:
- MobileNet V1
- MobileNet V2
- YOLOv3
- ResNet-50
To learn more about our optimization technology, please contact us at aidnn@nanosemitech.com.