The input array x has shape (1 x 3) and the output array has shape (3 x 3) (input columns by input columns). The output array's diagonal entries are the input values squared; when row != column, each entry is x(row) + x(col). The code is currently written for a 1 x 3 input but should handle inputs of varying width. I cannot use 'def'. The current code does not work; what would you recommend?
x = np.array([[0, 5, 10]])
output array formulas =
[[x(row)^2,       x(row)+x(col),  x(row)+x(col)]
 [x(row)+x(col),  x(row)^2,       x(row)+x(col)]
 [x(row)+x(col),  x(row)+x(col),  x(row)^2]]
# where row and column refer to the output matrix's (1-based) row and column. For example, the value at (1, 2) is x(1) + x(2) = 0 + 5 = 5
ideal output =
[[0 5 10]
[5 25 15]
[10 15 100]]
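
If broadcasting is an option, here is a minimal sketch (one possible approach, not the only one) that builds the whole matrix with no explicit loop and no 'def': adding the (1, n) row to its (n, 1) transpose produces every pairwise sum, and np.fill_diagonal then overwrites the diagonal with the squares. This works for any 1 x n input:

import numpy as np

x = np.array([[0, 5, 10]])

# (1, n) + (n, 1) broadcasts to an (n, n) matrix of pairwise sums x(i) + x(j)
result = x + x.T

# overwrite the diagonal in place with the squared input values
np.fill_diagonal(result, (x ** 2).ravel())

print(result)
# [[  0   5  10]
#  [  5  25  15]
#  [ 10  15 100]]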
Code Attempted:
import numpy as np

x = np.array([[0, 5, 10]])
r, c = np.shape(x)
results = np.zeros((c, c))
# 'g' was undefined in the original attempt; set the diagonal of 'results' instead
results[range(c), range(c)] = (x ** 2).ravel()
# loop over index pairs; 'for i in x' yields whole rows of x, not usable indices
for i in range(c):
    for j in range(c):
        if i != j:
            results[i, j] = x[0, i] + x[0, j]
I also considered a comprehension along the lines of [n[i][j]**2 if i == j else n[i][j] for i in range(len(n)) for j in range(len(n))]. (Obviously you would have to get the actual indices i and j from somewhere, so this is just an algorithm sketch, but you could use enumerate().)
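
For completeness, a runnable version of that comprehension idea (a sketch that builds the matrix directly from x rather than from a precomputed n, using enumerate() to supply the indices):

import numpy as np

x = np.array([[0, 5, 10]])
vals = x[0]  # the row of input values

# squares on the diagonal, pairwise sums everywhere else
result = np.array([[a ** 2 if i == j else a + b
                    for j, b in enumerate(vals)]
                   for i, a in enumerate(vals)])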