There isn't enough information. You also need to know the distribution of the M values across the N inner vectors. Once you have that, it's straightforward to find the overall complexity:
std::sort has a complexity of O(N·log(N)) comparisons.
Comparing two std::vectors v1 and v2 (their operator<) uses std::lexicographical_compare(v1, v2), which has a complexity of O(min(v1.size(), v2.size())) comparisons.
int comparison has a complexity of O(1).
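For concreteness, here is a minimal sketch of the setup being analyzed, assuming the outer container is a std::vector<std::vector<int>> named outer (the element values are just placeholders):

```cpp
#include <algorithm>
#include <vector>

int main() {
    // Hypothetical setup: N inner vectors holding M ints in total.
    std::vector<std::vector<int>> outer = {
        {3, 1, 4}, {1, 5}, {9, 2, 6, 5}, {3}
    };

    // std::sort performs O(N·log(N)) comparisons of inner vectors;
    // each such comparison is std::vector's operator<, which is
    // lexicographical and costs O(min(lhs.size(), rhs.size()))
    // int comparisons.
    std::sort(outer.begin(), outer.end());
}
```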
We'll let E(M, N) be a function of M and N that gives the mean of min(v1.size(), v2.size()) taken over every pair of inner vectors, i.e. the average cost of a single vector comparison.
- For example, if you have a uniform distribution (each inner vector holds M/N elements), this is trivially equal to M/N; the sketch after this list checks that empirically.
- Take the product: Big Oh = N·log(N)·E(M, N)·1.
- For a uniform distribution, this would be M·log(N).
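Here is a minimal sketch that makes E(M, N) concrete (mean_min_size is a hypothetical helper name, nothing standard): it computes the mean of min(v1.size(), v2.size()) over all pairs of inner vectors, and for a uniform distribution it comes out to exactly M/N.

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

// Mean of min(a.size(), b.size()) over all distinct pairs of inner
// vectors -- an empirical stand-in for E(M, N).
double mean_min_size(const std::vector<std::vector<int>>& outer) {
    std::size_t total = 0, pairs = 0;
    for (std::size_t i = 0; i < outer.size(); ++i)
        for (std::size_t j = i + 1; j < outer.size(); ++j) {
            total += std::min(outer[i].size(), outer[j].size());
            ++pairs;
        }
    return pairs ? static_cast<double>(total) / pairs : 0.0;
}

int main() {
    // Uniform distribution: N = 4 inner vectors, M = 12 ints in total,
    // each vector of size M/N = 3, so the mean should be exactly 3.
    std::vector<std::vector<int>> outer(4, std::vector<int>(3));
    std::cout << mean_min_size(outer) << '\n';  // prints 3
}
```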
You can use discrete probability theory to work out what the E(M, N) function is for any distribution of the M elements across the N vectors.
Edit 1: To drive home how/why this matters, consider a distribution that always makes my vectors look like this:
outer[0].size() == 1,
outer[1].size() == 1,
outer[2].size() == 1,
...,
outer[N-1].size() == (M - N + 1)
In this case, E(M, N) = 1, because at least one vector in every pair has size 1, so std::lexicographical_compare only ever has a single element to compare before it can decide. Thus, for this particular distribution, I will always have a complexity of O(N·log(N)). But with a uniform distribution, I'd have O(M·log(N)).
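A quick self-contained check of that claim (the concrete sizes N = 4 and M = 10 are only illustrative assumptions):

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    // The skewed distribution from Edit 1, with hypothetical N = 4 and
    // M = 10: three inner vectors of size 1, one of size M - N + 1 = 7.
    std::vector<std::vector<int>> outer = {
        std::vector<int>(1), std::vector<int>(1),
        std::vector<int>(1), std::vector<int>(7)};

    // Every pair contains at least one size-1 vector, so
    // min(v1.size(), v2.size()) is always 1 and the mean E(M, N) is 1.
    std::size_t total = 0, pairs = 0;
    for (std::size_t i = 0; i < outer.size(); ++i)
        for (std::size_t j = i + 1; j < outer.size(); ++j) {
            total += std::min(outer[i].size(), outer[j].size());
            ++pairs;
        }
    std::cout << static_cast<double>(total) / pairs << '\n';  // prints 1
}
```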
Edit 2: Following the comment where you define your distribution, let's try to find E(M, N).
First, notice that there are in total T = (N choose 2) = N·(N - 1)/2 different combinations of vector comparisons.
One (and only one) combination will take X = O((M - N + 2)/2) comparisons, and it occurs with probability P(X) = 1/T.
Every other combination will require just 1 comparison (O(1)), and so those cases occur with probability P(1) = (T - 1)/T.
Finding the mean is simple: X·P(X) + 1·P(1).
Given this, WolframAlpha says: E(M, N) = (M + (N - 2) N)/((N - 1) N).
Multiplying that function by N log(N) gives us (M + (N - 2) N) log(N) / (N - 1), which can be further simplified to the Big Oh you're looking for: O((M/N + N) log(N)).
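As a sanity check, here is a sketch that compares that closed form against the brute-force mean of min sizes for one layout consistent with Edit 2: N - 2 inner vectors of size 1 plus two of size (M - N + 2)/2. The concrete values N = 4 and M = 10 are assumptions for illustration; both printed numbers should come out to 1.5.

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    // Hypothetical instance of the Edit 2 distribution: N = 4, M = 10,
    // i.e. N - 2 = 2 inner vectors of size 1 and two inner vectors of
    // size (M - N + 2)/2 = 4 each.
    const double N = 4, M = 10;
    std::vector<std::vector<int>> outer = {
        std::vector<int>(1), std::vector<int>(1),
        std::vector<int>(4), std::vector<int>(4)};

    // Brute-force mean of min(v1.size(), v2.size()) over all pairs.
    std::size_t total = 0, pairs = 0;
    for (std::size_t i = 0; i < outer.size(); ++i)
        for (std::size_t j = i + 1; j < outer.size(); ++j) {
            total += std::min(outer[i].size(), outer[j].size());
            ++pairs;
        }

    std::cout << static_cast<double>(total) / pairs << '\n';  // 1.5
    std::cout << (M + (N - 2) * N) / ((N - 1) * N) << '\n';   // 1.5
}
```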