Description
Required prerequisites
- Make sure you've read the documentation. Your issue may be addressed there.
- Search the issue tracker and Discussions to verify that this hasn't already been reported. +1 or comment there if it has.
- Consider asking first in the Gitter chat room or in a Discussion.
What version (or hash if on master) of pybind11 are you using?
latest
Problem description
When I call into the pybind11-wrapped .so file from Python, the destructor of my custom class runs multiple times, and earlier than I expect.
```cpp
// class 1:
class QnnModelLoader
{
public:
    QnnModelLoader();
    ~QnnModelLoader();
    std::shared_ptr<qnnInferenceEngine> load_model_from_qnn(std::string input_model_path, std::string backend_path, bool loadFromCachedBinary);

private:
    // members used by load_model_from_qnn (see its implementation below)
    std::shared_ptr<qnn::tools::sample_app::QnnSampleApp> m_sampleApp;
    bool m_loadFromCachedBinary;
};

// class 2:
class qnnInferenceEngine
{
public:
    qnnInferenceEngine(std::shared_ptr<qnn::tools::sample_app::QnnSampleApp> sampleApp, bool loadFromCachedBinary);
    ~qnnInferenceEngine();
    std::vector<std::vector<float>> infer_model();
};
```
My pybind11 binding code is as follows:

```cpp
PYBIND11_MODULE(qnn_loader, m) {
    py::class_<qnnInferenceEngine>(m, "qnnInferenceEngine")
        .def(py::init<std::shared_ptr<QnnSampleApp>, bool>())
        .def("infer_model", &qnnInferenceEngine::infer_model);
    py::class_<QnnModelLoader>(m, "QnnModelLoader")
        .def(py::init<>())
        .def("load_model_from_qnn", &QnnModelLoader::load_model_from_qnn);
}
```
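While reading the pybind11 documentation on smart pointers, I saw that a class can declare `std::shared_ptr` as its holder type by passing it as the second template argument to `py::class_`. Below is a minimal sketch of that pattern with hypothetical `Engine`/`make_engine` names (stand-ins, not my real classes); I am not sure whether this is the right direction for my case:

```cpp
#include <memory>
#include <pybind11/pybind11.h>

namespace py = pybind11;

// Hypothetical stand-ins for qnnInferenceEngine / load_model_from_qnn.
struct Engine {
    int run() const { return 42; }
};

std::shared_ptr<Engine> make_engine() {
    return std::make_shared<Engine>();
}

PYBIND11_MODULE(holder_demo, m) {
    // The second template argument makes std::shared_ptr<Engine> the holder
    // type, so Python shares ownership with C++ shared_ptrs instead of using
    // pybind11's default std::unique_ptr<Engine> holder.
    py::class_<Engine, std::shared_ptr<Engine>>(m, "Engine")
        .def("run", &Engine::run);
    m.def("make_engine", &make_engine);
}
```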
My Python call code is as follows:

```python
def test_6(model_path, backend_path):
    Loader = qnn_inference_engine.QnnModelLoader()
    inferEngine = Loader.load_model_from_qnn(model_path, backend_path, False)
    print("Hello World")
```
Something strange happens here. I added print statements to the constructors and destructors of both classes in C++; when the Python code runs, the output is:
```text
The constructor of QnnModelLoader is called
The constructor of qnnInferenceEngine is called
The destructor of qnnInferenceEngine is called
Hello World
The destructor of QnnModelLoader is called
The destructor of qnnInferenceEngine is called
```
I don't understand why the destructor of qnnInferenceEngine is called before "Hello World" is printed.
Below is the concrete implementation of load_model_from_qnn:
```cpp
std::shared_ptr<qnnInferenceEngine> QnnModelLoader::load_model_from_qnn(std::string model_path, std::string backend_path, bool loadFromCachedBinary)
{
    // .......
    m_sampleApp = std::make_shared<QnnSampleApp>(std::string{""});
    return std::make_shared<qnnInferenceEngine>(m_sampleApp, m_loadFromCachedBinary);
}
```
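To check my understanding of the lifetime issue outside pybind11, I wrote a small pure-C++ sketch (the `Tracked` class is hypothetical, and this is only my assumption about the mechanism): if the wrapper copies the object out of the returned `shared_ptr` instead of sharing ownership, the original instance is destroyed before the caller ever uses the copy, which would match the "destructor before Hello World" pattern I observe.

```cpp
#include <cassert>
#include <memory>

// Hypothetical class counting constructions/destructions,
// standing in for qnnInferenceEngine.
struct Tracked {
    static int ctors;
    static int dtors;
    Tracked() { ++ctors; }
    Tracked(const Tracked&) { ++ctors; }
    ~Tracked() { ++dtors; }
};
int Tracked::ctors = 0;
int Tracked::dtors = 0;

// Copies the object out of a shared_ptr: the original is destroyed
// as soon as the local shared_ptr goes out of scope, before the
// caller touches the returned copy.
Tracked copy_out() {
    std::shared_ptr<Tracked> sp = std::make_shared<Tracked>();
    return *sp;  // copy-constructs the return value; sp then releases the original
}

// Returns the shared_ptr itself: ownership is handed to the caller,
// nothing is destroyed early.
std::shared_ptr<Tracked> share_out() {
    return std::make_shared<Tracked>();
}
```

With `copy_out`, one destructor has already run by the time the caller receives the result; with `share_out`, nothing is destroyed until the caller drops the pointer.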
For load_model_from_qnn I tried every return value policy, but each one produces the same behavior in Python. I don't know whether I'm using them incorrectly; my specific usage is:

```cpp
.def("load_model_from_qnn", &QnnModelLoader::load_model_from_qnn, return_value_policy::move);
```
I tried all of the following:

- `return_value_policy::take_ownership`
- `return_value_policy::copy`
- `return_value_policy::move`
- `return_value_policy::reference`
- `return_value_policy::reference_internal`
- `return_value_policy::automatic`
- `return_value_policy::automatic_reference`
My core questions are as follows:
1. In the PYBIND11_MODULE function, the class QnnSampleApp appears (in the init signature of qnnInferenceEngine) even though I don't want to expose it. Am I writing this correctly?
2. Why is qnnInferenceEngine destructed in Python right after it is created (its destructor message is printed before "Hello World")? Is my PYBIND11_MODULE incorrect? If so, what should I change?
If PYBIND11_MODULE is fine, is my load_model_from_qnn wrong? I have given the core details of its implementation above, and it works fine in pure C++. I know C++ and Python differ in how they manage object lifetimes, so how should I modify it?
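Regarding question 1, one idea I am considering (an assumption on my part, not something I have verified) is to simply not bind the QnnSampleApp-based constructor at all, since my Python code never constructs qnnInferenceEngine directly. The fragment below shows what my binding would look like with that init removed:

```cpp
// Binding fragment (not a complete module): qnnInferenceEngine without
// the py::init that mentions QnnSampleApp. Instances can then only be
// created from C++, via load_model_from_qnn, so QnnSampleApp stays hidden.
py::class_<qnnInferenceEngine>(m, "qnnInferenceEngine")
    .def("infer_model", &qnnInferenceEngine::infer_model);
```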
Very much looking forward to your help, thanks!
Reproducible example code
Please refer to Problem description
Is this a regression? Put the last known working version here if it is.
Not a regression