Below is my reading of a queue-based topological sort algorithm from my textbook:
void topologicalsort(struct Graph* G) {
    struct queue* Q;
    int counter;
    int v, w;

    Q = createqueue();
    counter = 0;

    /* Enqueue every vertex that starts with in-degree zero. */
    for (v = 0; v < G->V; v++)
        if (indegree[v] == 0)
            enqueue(Q, v);

    while (!isemptyqueue(Q)) {
        v = dequeue(Q);
        topologicalorder[v] = ++counter;

        /* The book writes this step as "for each w adjacent to v";
           here I assume an adjacency-matrix graph (see below). */
        for (w = 0; w < G->V; w++)
            if (G->adj[v][w] && --indegree[w] == 0)
                enqueue(Q, w);
    }
}
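The book never shows the supporting declarations, so for completeness this is roughly what I am assuming for the graph, the queue and the two global arrays. All of these names and layouts are my guesses, not the book's code; in particular I assume an adjacency-matrix graph and a fixed-capacity circular queue:

#include <stdio.h>
#include <stdlib.h>

#define MAXV 64

/* Globals the algorithm reads and writes without declaring them. */
int indegree[MAXV];
int topologicalorder[MAXV];

/* Assumed graph representation: adj[u][w] != 0 means an edge u -> w. */
struct Graph {
    int V;                  /* number of vertices */
    int adj[MAXV][MAXV];    /* adjacency matrix   */
};

/* A simple fixed-capacity circular queue of vertex indices. */
struct queue {
    int data[MAXV];
    int front, rear, size;
};

struct queue* createqueue(void) {
    struct queue* Q = malloc(sizeof *Q);
    Q->front = 0;
    Q->rear = -1;
    Q->size = 0;
    return Q;
}

int isemptyqueue(struct queue* Q) {
    return Q->size == 0;
}

void enqueue(struct queue* Q, int v) {
    Q->rear = (Q->rear + 1) % MAXV;
    Q->data[Q->rear] = v;
    Q->size++;
}

int dequeue(struct queue* Q) {
    int v = Q->data[Q->front];
    Q->front = (Q->front + 1) % MAXV;
    Q->size--;
    return v;
}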
The algorithm is failing for the following graph:
In the given graph, 7, 5 and 3 initially have in-degree zero, so they will be enqueued. But none of the vertices adjacent to 7, 5 or 3 has in-degree 1, so if (--indegree[w] == 0) will never hold while 7, 5 and 3 are processed. Hence nothing further is ever enqueued, and the algorithm never processes the remaining vertices. Why is the algorithm failing if the graph is a DAG? In what way is it incorrect?
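To have something concrete to run, here is the small driver I am using. The graph in it is my own stand-in, not the one from the book; I only made sure it has the same property, namely several source vertices whose successors all start with in-degree 2. It uses the declarations I sketched above.

int main(void) {
    /* Stand-in DAG (mine, not the book's figure):
       0 -> 2, 1 -> 2, 0 -> 3, 2 -> 3
       Vertices 0 and 1 start with in-degree zero;
       vertices 2 and 3 both start with in-degree 2. */
    struct Graph G = {0};
    int u, w;

    G.V = 4;
    G.adj[0][2] = 1;
    G.adj[1][2] = 1;
    G.adj[0][3] = 1;
    G.adj[2][3] = 1;

    /* Derive the initial in-degrees from the adjacency matrix. */
    for (u = 0; u < G.V; u++)
        for (w = 0; w < G.V; w++)
            if (G.adj[u][w])
                indegree[w]++;

    topologicalsort(&G);

    for (u = 0; u < G.V; u++)
        printf("vertex %d gets position %d\n", u, topologicalorder[u]);
    return 0;
}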
I know that we can also implement topological sort using DFS, but I want to get the queue-based version above working as-is.


A comment on the question asks: "struct queue, struct Graph, createqueue(), dequeue(), enqueue() and indegree... for each w to ...? If you're just using pseudocode, why is this tagged C?"