
I'm trying to understand Sakurai's explanation leading up to the projection operator, pp. 17-18 (Section 1.3.2), but I'm slightly confused by the notation. So he first says that an arbitrary ket $|\alpha\rangle$ can be represented as $$|\alpha \rangle = \sum_{a'}c_{a'}|a'\rangle.$$

He then multiplies the left by $\langle a''\vert$ and then uses the orthonormality property ($\langle a'' | a'\rangle = \delta_{a'' a'}$, where $a'$ and $a''$ pertain to different eigenkets of $A$). He then gets that $$c_{a'} = \langle a' | \alpha\rangle\;.$$

My questions are:

  • Aren't we summing over the eigenkets of $A$, so is $\langle a'' |$ just an arbitrary bra that we are using?
  • I cannot see how he gets to the second equation after multiplying on both sides. Since the eigenkets are orthogonal, as previously proven, the inner products will be 0, so I find $|\alpha \rangle$ is 0.
  • Is it one coefficient for each eigenket, and we just found the coefficient for the first eigenket?
  • Later on in the book, Sakurai constructs the square matrix of an operator as $\langle a'' | X | a' \rangle$. Are these arbitrary kets, and why do they appear next to the operator? It reminds me of how Griffiths defined an operator using $\psi$ and $\phi$. Is this the same?
3 comments:
  • $\delta_{a''\,a'}\neq0$ for $a''=a'$. – Commented Aug 31 at 19:45
  • @CosmasZachos Yes, that makes more sense. – Commented Aug 31 at 20:05
  • @KyleKanos So I would assume $a'' = a'$, and in that case you get $\langle a' | \alpha \rangle = c_{a'}$. Ah, I see now. – Commented Aug 31 at 20:08

1 Answer


I'm trying to understand Sakurai's explanation leading up to the projection operator, pp. 17-18 (Section 1.3.2), but I'm slightly confused by the notation. So he first says that an arbitrary ket $|\alpha\rangle$ can be represented as $$|\alpha \rangle = \sum_{a'}c_{a'}|a'\rangle.$$

When you are confused, try writing out a finite example.

For example, maybe consider just three basis kets. The finite case might literally apply, or it might not, but working through the concrete example can help in understanding the abstract one.

In the three basis-ket case we have:
$$ |\alpha\rangle = c_1|1\rangle+c_2|2\rangle+c_3|3\rangle\;. $$

And, as you noted, there is no "dependence" on $a'$, since it is just a dummy summation variable.

He then multiplies the left by $\langle a''\vert$

Here, this $a''$ means that he gets to choose arbitrarily whichever of the three basis states he wants: $a''$ is a "free" variable, not a "dummy" summation variable.

So, for example in the three state case, $\langle a''|$ could be $\langle 1|$, or $\langle 2|$, or $\langle 3|$.

Suppose we chose $\langle 2|$. Then we have $$ \langle 2|\alpha\rangle = \langle 2|(c_1|1\rangle+c_2|2\rangle+c_3|3\rangle) = c_1\times 0 + c_2\times 1 + c_3\times 0 = c_2 \;. $$

It should be straightforward to see that if we chose $\langle a''|$ to be $\langle 1|$ we would end up with $c_1$ and if we chose $\langle a''|$ to be $\langle 3|$ we would end up with $c_3$.
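The same bookkeeping can be checked numerically. In this sketch the basis kets are standard unit vectors, and the coefficients $c_1, c_2, c_3$ are hypothetical values chosen only for illustration:

```python
import numpy as np

# Orthonormal basis kets |1>, |2>, |3> as standard unit vectors
basis = [np.array([1.0, 0.0, 0.0]),
         np.array([0.0, 1.0, 0.0]),
         np.array([0.0, 0.0, 1.0])]

# Hypothetical expansion coefficients c_1, c_2, c_3
c = [0.5, -1.2, 2.0]

# |alpha> = c_1|1> + c_2|2> + c_3|3>
alpha = sum(ci * ket for ci, ket in zip(c, basis))

# <a''|alpha> picks out the coefficient of the chosen basis ket:
# every other term in the sum hits <a''|a'> = 0
for a_dd, bra in enumerate(basis, start=1):
    print(f"<{a_dd}|alpha> = {np.vdot(bra, alpha)}")
```

Note that `np.vdot` conjugates its first argument, which is exactly the bra–ket inner product $\langle a''|\alpha\rangle$.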

and then uses the orthonormality property ($\langle a'' | a'\rangle = \delta_{a'' a'}$ where $a'$ and $a''$ pertain to different eigenkets of $A$). He then gets that $$c_{a'} = \langle a' | \alpha\rangle\;.$$

Yes, this notation just summarizes the fact that we really could have chosen $a''$ to be whatever basis state index we wanted. We don't have to write out every concrete case explicitly. In the above equation $a'$ is now a "free" index. It is not a "dummy" index like it was when it represented the summation.
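Written out explicitly in Sakurai's general notation, the step in question is just two substitutions:
$$\langle a''|\alpha\rangle = \sum_{a'} c_{a'}\,\langle a''|a'\rangle = \sum_{a'} c_{a'}\,\delta_{a''\,a'} = c_{a''}\;,$$
and relabeling the free index $a''\to a'$ gives $c_{a'} = \langle a'|\alpha\rangle$.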

Aren't we summing over the eigenkets of $A$, so is $\langle a'' |$ just an arbitrary bra that we are using?

Yes, it is arbitrary since the $a''$ is a "free" index, whereas the $a'$ was a "dummy" summation index in the sum.

I cannot see how he gets to the second equation after multiplying on both sides. Since the eigenkets are orthogonal, as previously proven, the inner products will be 0, so I find $|\alpha \rangle$ is 0.

It is only zero for some (most) of the terms. For exactly one term it is non-zero and equal to $1$.

Is it one coefficient for each eigenket, and we just found the coefficient for the first eigenket?

Yes, it is one coefficient for each term. We found the coefficient for one of the terms.

The sum has one coefficient for each eigenket. And Sakurai is showing how taking the inner product of $|\alpha\rangle$ with the eigenbra is equal to the coefficient of the corresponding eigenket.
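Since the question mentions this is leading up to the projection operator: in the same three-state sketch, the operator $|a'\rangle\langle a'|$ projects out the $a'$ term of the sum, and summing over all of them recovers the identity (completeness). A minimal NumPy illustration, again with hypothetical coefficients:

```python
import numpy as np

basis = [np.eye(3)[i] for i in range(3)]  # |1>, |2>, |3>

# Projector |2><2| as an outer product
P2 = np.outer(basis[1], basis[1])

# Hypothetical |alpha> with coefficients c_1, c_2, c_3
alpha = np.array([0.5, -1.2, 2.0])

# |2><2|alpha> = c_2|2>: only the c_2 component survives
print(P2 @ alpha)

# Summing all projectors gives the identity (completeness relation)
identity = sum(np.outer(k, k) for k in basis)
print(np.allclose(identity, np.eye(3)))
```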

Later on in the book, Sakurai constructs the square matrix of an operator as $\langle a'' | X | a' \rangle$. Are these arbitrary kets,

Yes, in this case $a'$ and $a''$ are both arbitrary "free" indices.

and why do they appear next to the operator?

They appear next to the operator because the operator acts on kets (e.g., some $|a'\rangle$) to produce kets, and kets can then be paired with bras in an inner product (e.g., with some $\langle a''|$).
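In the same finite sketch, the matrix of an operator has elements $X_{a''a'} = \langle a''|X|a'\rangle$, with rows indexed by the bra $a''$ and columns by the ket $a'$. The operator `X` below is a hypothetical matrix chosen for illustration:

```python
import numpy as np

# A hypothetical operator X on the three-state space (any 3x3 matrix works)
X = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

# The three basis kets |1>, |2>, |3> as standard unit vectors
basis = [np.eye(3)[i] for i in range(3)]

# Matrix element <a''|X|a'>: X acts on the ket |a'>, then we take
# the inner product of the result with the bra <a''|
elements = np.array([[np.vdot(bra, X @ ket) for ket in basis]
                     for bra in basis])

# In this basis the table of matrix elements reproduces X itself
print(elements)
```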

It reminds me of how Griffiths defined an operator using $\psi$ and $\phi$. Is this the same?

Maybe. Probably, I guess. But you do not provide enough information to say for sure.

