
I trained a simple model for MNIST using this example: https://github.com/pytorch/examples/blob/main/cpp/mnist/mnist.cpp

I added code for saving the model like below:

std::string model_path = "model.pt";
torch::serialize::OutputArchive output_archive;
model.save(output_archive);
output_archive.save_to(model_path);

and in another .cpp file, I tried to load it like below:

torch::jit::script::Module model;
model = torch::jit::load("model.pt");
model.to(device);

std::vector<torch::jit::IValue> inputs;
inputs.push_back(torch::ones({1, 1, 28, 28}));

at::Tensor output = model.forward(inputs).toTensor();
std::cout << output << std::endl;

It compiled fine, but threw an error at runtime:

terminate called after throwing an instance of 'c10::Error'
  what():  Method 'forward' is not defined.
Exception raised from get_method at .../include/torch/csrc/jit/api/object.h:111 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x57 (0x7f8e0163f897 in .../lib/libc10.so)
frame #1: c10::detail::torchCheckFail(char const*, char const*, unsigned int, std::string const&) + 0x64 (0x7f8e015efb25 in .../lib/libc10.so)
frame #2: <unknown function> + 0xa729 (0x560eaa26e729 in ./example-app)
frame #3: <unknown function> + 0xa8b1 (0x560eaa26e8b1 in ./example-app)
frame #4: <unknown function> + 0x5355 (0x560eaa269355 in ./example-app)
frame #5: __libc_start_main + 0xe7 (0x7f8d8eea0c87 in /lib/x86_64-linux-gnu/libc.so.6)
frame #6: <unknown function> + 0x4bca (0x560eaa268bca in ./example-app)

How can I fix it?

I also tried to use TORCH_MODULE to create a module holder and save the model as in this reference: https://discuss.pytorch.org/t/libtorch-how-to-save-model-in-mnist-cpp-example/34234/5

but it didn't work (maybe I used it wrong).

  • You have pushed your model to the device, but not your input tensor. Try inputs.push_back(torch::ones({1, 1, 28, 28}).to(device)); Commented Jun 5, 2024 at 7:37
  • @FlyingTeller Thank you, but the same error still occurred. Commented Jun 5, 2024 at 8:27

1 Answer


For loading I use:

torch::serialize::InputArchive in;
in.load_from("model.pt");
model->load(in);

This works if your model was created and saved purely in C++, as seems to be the case here. An archive written by the C++ frontend's model.save() stores only the parameters and buffers, not a scripted forward method, which is why torch::jit::load complains that 'forward' is not defined.

But you need to delete the saved file if you change the model's layout: if the layout (i.e. the layers) differs between saving and loading, loading the old archive and using the model will crash.
