
I have just started with libtorch and I am having some trouble with while loops and tensors :roll_eyes: My main function looks like this:

#include <torch/torch.h>
#include <iostream>

int main()
{

  auto tensor_create_options = torch::TensorOptions().dtype(torch::kFloat32).device(torch::kCPU).requires_grad(false);
  torch::Tensor randn_tensor = torch::randn({10}, tensor_create_options);

  int randn_tensor_size = randn_tensor.sizes()[0];

  while (randn_tensor_size > 5)
  {
    std::cout << "==> randn_tensor shape: " << randn_tensor.sizes() << std::endl;
    randn_tensor.reset();
    torch::Tensor randn_tensor = torch::randn({3}, tensor_create_options);
    std::cout << "==> randn_tensor shape 3: " << randn_tensor.sizes() << std::endl;
    randn_tensor_size--;
  }

  return 0;
}

and I get this error thrown:

==> randn_tensor shape: [10]
==> randn_tensor shape 3: [3]
terminate called after throwing an instance of 'c10::Error'
  what():  sizes() called on undefined Tensor

Essentially, what I want to do is recreate the tensor within the while loop and ensure that I can access it again inside the while loop.

Interestingly, it seems to have created the tensor of reduced size, but the while loop does not seem to recognise this.

Thank you!

  • torch::Tensor randn_tensor: to avoid unnecessary confusion, remove the shadowing here. You have already declared an object of this name in the outer scope! (Commented Feb 26, 2021 at 10:08)
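
For anyone unfamiliar with shadowing, here is a minimal plain C++ sketch of the same effect, with no libtorch involved:

#include <iostream>

int main()
{
    int x = 10;
    {
        int x = 3;                   // declares a NEW variable that shadows the outer x
        std::cout << x << std::endl; // prints 3
    }                                // the inner x is destroyed at the end of its scope
    std::cout << x << std::endl;     // prints 10: the outer x was never touched
    return 0;
}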

1 Answer


You have a shadowing issue here; try writing the loop this way:

while (randn_tensor_size > 5)
{
    std::cout << "==> randn_tensor shape: " << randn_tensor.sizes() << std::endl;
    randn_tensor.reset(); // possibly unnecessary, see below
    randn_tensor = torch::randn({3}, tensor_create_options); // reassign instead of redeclaring
    std::cout << "==> randn_tensor shape 3: " << randn_tensor.sizes() << std::endl;
    randn_tensor_size--;
}

Maybe the reset of the tensor isn't even necessary any more; that depends on the internals of this class. If deleting isn't the actual intention of your code and you only want the original tensor to be gone, then simply reset it once right before the loop. Independently of this, try to make the code clearer in terms of intent! I do not really understand what exactly you want to achieve. Your loop counter is misused, since you mix size semantics with counting semantics while depending only on the initial size. Within the loop, you simply recreate the tensor again and again without affecting your counter.
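
If the intent is, for example, to keep shrinking the tensor until it has 5 or fewer elements (an assumption on my part), a loop driven by the tensor's actual size expresses that directly. A minimal sketch under that assumption:

#include <torch/torch.h>
#include <iostream>

int main()
{
    auto opts = torch::TensorOptions().dtype(torch::kFloat32).device(torch::kCPU);
    torch::Tensor t = torch::randn({10}, opts);

    // Drive the loop by the tensor's actual size instead of a separate counter.
    while (t.size(0) > 5)
    {
        std::cout << "==> shape: " << t.sizes() << std::endl;
        t = torch::randn({t.size(0) - 1}, opts); // reassign; no shadowing, no reset() needed
    }
    std::cout << "==> final shape: " << t.sizes() << std::endl;
    return 0;
}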


5 Comments

Hmm, but the tensors are of different sizes, so I was not sure if I can just assign a reduced-size tensor to the same tensor variable. Anyway, I will try this now!
Maybe you should first be confident about the intentions here. Semantically, I do not really understand what you want to achieve with your code. See my further edit.
You were correct, it was a shadowing issue. I also did not require the reset! But I am a bit surprised by this: originally I declare the tensor to have 10 elements, so surely the memory is now allocated for those 10 elements? How come it is fine when I just reassign 3 elements, without deleting? Also, I see that if I use the auto keyword on the new tensor creation, it screws up. Hmm.
Without a proper renaming, auto does the same thing as in your original code. If reassigning is semantically correct for your abstract problem, then simply do it; syntactically it is totally fine at least. See the sketch after these comments for why that works.
Ah, I understand now (I think!). Thank you for the detailed explanation @Secundi, really appreciate your time and help!
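
To illustrate the memory question raised in the comments: torch::Tensor is a reference-counted handle to its underlying storage, so assignment simply rebinds the handle, and the old 10-element buffer is released automatically once no handle refers to it anymore. A minimal sketch:

#include <torch/torch.h>
#include <iostream>

int main()
{
    torch::Tensor t = torch::randn({10}); // handle -> 10-element storage
    torch::Tensor alias = t;              // second handle to the SAME storage

    t = torch::randn({3}); // rebinds t to new 3-element storage; the old storage
                           // stays alive only because alias still refers to it

    std::cout << t.sizes() << " " << alias.sizes() << std::endl; // prints [3] [10]
    return 0;
}   // both storages are released here when the last handles go away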
