This is an algorithm optimization problem.
I have an integer array A and want to build an array B such that B[i] is the index j of an element of A satisfying:
1. j > i
2. A[j] < A[i]
3. j is the smallest index satisfying conditions 1 and 2.
For example if A is:
A = [1,5,6,4,3,1,2]
Then B will be (indexing from 0)
B = [-1,3,3,4,5,-1,-1]
B[0] = -1 because there is no element less than 1 at an index greater than 0.
B[1] = 3 because A[3] is the closest element to the right of index 1 that is less than A[1] = 5. And so forth.
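To make the specification concrete, here is a naive O(n^2) reference sketch (the function name is my own; this is deliberately not the efficient solution):

```python
def build_B_naive(A):
    """Quadratic reference: for each i, scan right for the first smaller element."""
    n = len(A)
    B = [-1] * n
    for i in range(n):
        for j in range(i + 1, n):
            if A[j] < A[i]:
                B[i] = j  # first hit is the smallest such j, since we scan left to right
                break
    return B

print(build_B_naive([1, 5, 6, 4, 3, 1, 2]))  # → [-1, 3, 3, 4, 5, -1, -1]
```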
I know an O(n*log(n)) solution using a binary search tree. How can I improve the complexity to O(n), or prove that it is impossible?
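For reference, the O(n*log(n)) bound can also be reached without an augmented BST. One common variant (a sketch under my own naming, not necessarily the approach referred to above) sweeps right-to-left and keeps a min-index segment tree over value ranks: at step i the tree holds every index j > i, and a range-minimum query over ranks strictly below rank(A[i]) yields the smallest such j.

```python
def build_B_nlogn(A):
    """O(n log n): right-to-left sweep with a min-index segment tree over value ranks."""
    n = len(A)
    # Coordinate-compress values to ranks 0..m-1.
    rank = {v: r for r, v in enumerate(sorted(set(A)))}
    m = len(rank)
    size = 1
    while size < m:
        size *= 2
    INF = float('inf')
    tree = [INF] * (2 * size)  # tree[k] = min index stored in that rank range

    def update(pos, idx):
        # Record index idx at value-rank pos, propagating the minimum upward.
        pos += size
        while pos:
            tree[pos] = min(tree[pos], idx)
            pos //= 2

    def query(lo, hi):
        # Minimum stored index over ranks in the half-open range [lo, hi).
        res = INF
        lo += size
        hi += size
        while lo < hi:
            if lo & 1:
                res = min(res, tree[lo])
                lo += 1
            if hi & 1:
                hi -= 1
                res = min(res, tree[hi])
            lo //= 2
            hi //= 2
        return res

    B = [-1] * n
    for i in range(n - 1, -1, -1):
        j = query(0, rank[A[i]])  # smallest j > i with A[j] < A[i], if any
        if j != INF:
            B[i] = j
        update(rank[A[i]], i)
    return B

print(build_B_nlogn([1, 5, 6, 4, 3, 1, 2]))  # → [-1, 3, 3, 4, 5, -1, -1]
```

Each of the n iterations does one O(log n) query and one O(log n) update, giving O(n*log(n)) overall; the question is whether the logarithmic factor can be removed entirely.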