Re: Python dependencies in add-ons


Noelia Ruiz
 

Hi, regarding the question about C++: as far as I know, this is not possible. An add-on needs at least some Python files to hold its classes, as described in the developer guide.
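For illustration, the kind of Python file the guide means is a global plugin module. This is only a minimal sketch of that convention (the file name is made up):

    # globalPlugins/imageDescriber.py  (hypothetical file name)
    import globalPluginHandler

    class GlobalPlugin(globalPluginHandler.GlobalPlugin):
        # NVDA instantiates this class when the add-on is loaded
        def __init__(self):
            super().__init__()
        # script_* methods and event handlers would go here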
I'm very interested in your add-on. I hope you can share it with us and, if you want, send it here for review so we can include it on the add-ons website, following the future add-ons workflow once NV Access can work on it, as they have planned.

Thanks for this project.

2020-06-08 17:06 GMT+02:00, Christopher Pross <chpross42@...>:

Hey,

I had the same question for my add-on, the webcam face-guiding add-on. It uses deep-learned networks loaded through OpenCV, and OpenCV requires NumPy. Here is how I solved it.

I built OpenCV myself against a 32-bit Python version compatible with NVDA. For NumPy, I took a pre-built binary, unpacked it, and copied the numpy folder into the add-on's folder. Then you have to temporarily add the add-on's root folder to the Python path via sys.path.
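Roughly, that sys.path step looks like this (just a sketch; the folder layout is how I arranged my add-on, yours may differ):

    import os
    import sys

    # directory of this add-on module; the bundled numpy folder sits next to it
    addonRoot = os.path.abspath(os.path.dirname(__file__))
    if addonRoot not in sys.path:
        sys.path.insert(0, addonRoot)
    try:
        import numpy  # resolved from the bundled copy
    finally:
        # drop the path again; numpy stays importable via sys.modules
        if addonRoot in sys.path:
            sys.path.remove(addonRoot)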
The self-built version of OpenCV contains a .pyd file; together with OpenCV's Python loader you can easily load OpenCV from an NVDA add-on.
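With the add-on root on sys.path as above, the import itself is then ordinary (again a sketch; the model file name is a placeholder):

    # assumes a cv2/ folder inside the add-on holding OpenCV's Python
    # loader plus the self-built .pyd next to it
    import cv2

    # load a deep-learned network with OpenCV's dnn module
    net = cv2.dnn.readNetFromONNX("model.onnx")  # placeholder file name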

I hope you could follow me.


all the best,

christopher

On 08.06.2020 at 13:19, Shubham Jain wrote:
Hello!

As part of my GSoC project
<https://summerofcode.withgoogle.com/projects/#6039693356957696>, I am
writing an add-on that allows users to get descriptions of images. To
work, the ML models depend on some Python libraries like NumPy,
Pillow, onnxruntime and OpenCV. My questions are:

* Is it possible to package these libraries in an add-on?
* Since I only require a few specific functions from these libraries, is it possible to package only those parts into the add-on?


Alternatively, the models could be converted to run in native C++ using
the LibTorch library. Is it possible to write add-ons in native C++?

regards,
Shubham Jain

