
I am making a chatbot using LangGraph, with each node allocated to teach a topic, e.g. AI, Prompts, LLMs, etc. I am using this document: https://langchain-ai.github.io/langgraph/how-tos/multi-agent-network/. Here is the code:

from typing_extensions import Literal
from langchain_openai import ChatOpenAI
from langchain_core.messages import ToolMessage
from langchain_core.tools import tool
from langgraph.graph import MessagesState, StateGraph, START
from langgraph.checkpoint.memory import MemorySaver
from langgraph.types import Command


model = ChatOpenAI(model="gpt-4o", temperature=0)

# Define a helper for each of the agent nodes to call


@tool
def transfer_to_chat_handler():
    """Hand the conversation back to the chat handler."""
    # This tool is not returning anything: we're just using it
    # as a way for the LLM to signal that it needs to hand off to another agent
    return


@tool
def transfer_to_teach_ai():
    """Transfer the user to the AI teacher."""
    return


@tool
def transfer_to_teach_prompt():
    """Transfer the user to the prompt teacher."""
    return

def chathandler(
    state: MessagesState,
) -> Command[Literal["teach_ai", "__end__"]]:
    system_prompt = (
        "Your name is Arti. You are a friendly teaching chatbot, teaching students about different topics. "
        "Your only job is to get the user's name and age and direct the user to the next lesson. "
        # "Here is the list of teachers: 'AI Teacher', 'Prompt Teacher'. "
        "Trigger a transition to 'teach_ai' once you receive the user's name and age."
        # "Trigger a transition to 'teach_prompt' once the user is clear about AI."
    )
    messages = [{"role": "system", "content": system_prompt}] + state["messages"]
    ai_msg = model.bind_tools([transfer_to_teach_ai]).invoke(messages)
    # If there are tool calls, the LLM needs to hand off to another agent
    if len(ai_msg.tool_calls) > 0:
        if "tool_calls" in ai_msg.additional_kwargs:
          tool_name = ai_msg.additional_kwargs["tool_calls"][0]["function"]["name"]

          if tool_name == "transfer_to_teach_ai":
              tool_call_id = ai_msg.tool_calls[-1]["id"]
              tool_msg = {
                  "role": "tool",
                  "content": "Successfully transferred to teach_ai",
                  "tool_call_id": tool_call_id,
              }
              return Command(goto="teach _ai", update={"messages": [ai_msg, tool_msg]})
          # elif tool_name == "transfer_to_teach_prompt":
          #     tool_call_id = ai_msg.tool_calls[-1]["id"]
          #     tool_msg = {
          #         "role": "tool",
          #         "content": "Successfully transferred to teach_prompt",
          #         "tool_call_id": tool_call_id,
          #     }
          #     return Command(goto="teach_prompt", update={"messages": [ai_msg, tool_msg]})

    return {"messages": [ai_msg]}


def teach_ai(
    state: MessagesState,
) -> Command[Literal["chathandler", "__end__"]]:
    system_prompt = (
        "Introduce yourself and explain that you will teach about AI in a fun and clear way.\n"
        "Give a proper definition and examples. "
        "Ask the user if they have any follow-up questions. "
        "If the user has completely understood the lesson, trigger a transition to 'chathandler'."
    )

    messages = [{"role": "system", "content": system_prompt}] + state["messages"]
    ai_msg = model.bind_tools([transfer_to_chat_handler]).invoke(messages)
    # If there are tool calls, the LLM needs to hand off to another agent
    if "tool_calls" in ai_msg.additional_kwargs:
          tool_name = ai_msg.additional_kwargs["tool_calls"][0]["function"]["name"]

          if tool_name == "transfer_to_chat_handler":
              tool_call_id = ai_msg.tool_calls[-1]["id"]
              tool_msg = {
                  "role": "tool",
                  "content": "Successfully transferred to chat_handler",
                  "tool_call_id": tool_call_id,
              }
              return Command(goto="chathandler", update={"messages": [ai_msg, tool_msg]})

    # If the expert has an answer, return it directly to the user
    return {"messages": [ai_msg]}


builder = StateGraph(MessagesState)
builder.add_node("chathandler", chathandler)
builder.add_node("teach_ai", teach_ai)
# we'll always start with the chat handler
builder.add_edge(START, "chathandler")
memory = MemorySaver()
graph = builder.compile(checkpointer=memory)

I have added "chathandler" to initiate the conversation; as soon as it gets the user's name and age, it should trigger the next node. You can see a function transfer_to_teach_prompt that was meant to trigger a teach_prompt node after the teach_ai node completes, and I added the if tool_name == "transfer_to_teach_ai": check so that I can add more nodes later on. However, I have a problem traversing nodes: as soon as teach_ai is triggered, I do not see anything on the response side (screenshot omitted).

Here is the code for running the bot, for your reference:

config = {"configurable": {"user_id": "3", "thread_id": "1"}}
while True:
  user_input = input("user:")
  if user_input == 'q':
    break
  for chunk in graph.stream(
    {"messages": [("user", user_input)]}, config=config):
    # pretty_print_messages is a display helper defined elsewhere (not shown here)
    pretty_print_messages(chunk)
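
For reference, the chunks can also be printed directly without the helper (assuming the default "updates" stream mode, where each chunk is a dict keyed by node name):

for chunk in graph.stream({"messages": [("user", user_input)]}, config=config):
    for node_name, update in chunk.items():
        for message in update.get("messages", []):
            # updates contain plain dicts (the tool messages above) or LangChain message objects
            content = message["content"] if isinstance(message, dict) else message.content
            print(f"{node_name}: {content}")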

I am sure I am making a mistake in the implementation, or maybe I am missing some information. Later I want to implement more nodes, and each node should be triggered after the user completes the lesson taught by the previous one; a rough sketch of what I have in mind is below.
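
This is an untested sketch of how I picture an additional lesson node, reusing model, transfer_to_chat_handler, MessagesState, and Command from the code above (the node name teach_prompt and the prompt wording are just placeholders):

def teach_prompt(
    state: MessagesState,
) -> Command[Literal["chathandler", "__end__"]]:
    system_prompt = (
        "Introduce yourself and explain that you will teach about prompts in a fun and clear way. "
        "Give a proper definition and examples. "
        "Ask the user if they have any follow-up questions. "
        "If the user has completely understood the lesson, trigger a transition to 'chathandler'."
    )
    messages = [{"role": "system", "content": system_prompt}] + state["messages"]
    ai_msg = model.bind_tools([transfer_to_chat_handler]).invoke(messages)
    if ai_msg.tool_calls:
        # Hand control back to the chat handler, which can then route to the next lesson
        tool_msg = {
            "role": "tool",
            "content": "Successfully transferred to chat_handler",
            "tool_call_id": ai_msg.tool_calls[-1]["id"],
        }
        return Command(goto="chathandler", update={"messages": [ai_msg, tool_msg]})
    return {"messages": [ai_msg]}


# Registered before builder.compile(); chathandler would then bind both transfer tools
# (transfer_to_teach_ai and transfer_to_teach_prompt) and list "teach_prompt" in its
# Command[Literal[...]] return annotation so it can route to either lesson.
builder.add_node("teach_prompt", teach_prompt)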

2 Answers


You have a space in the node id passed to the goto parameter:

return Command(goto="teach _ai"

There is a space in "teach _ai". The value passed to goto must exactly match the node name "teach_ai".

Replace this:

return Command(goto="teach _ai", update={"messages": [ai_msg, tool_msg]})

With this:

return Command(goto="teach_ai", update={"messages": [ai_msg, tool_msg]})

Secondly, you’ve defined nodes, but not all transitions are connected. You should connect possible paths in the graph:

builder.add_edge("chathandler", "teach_ai")
builder.add_edge("teach_ai", "chathandler")
