
Here's my code:

import pickle, os
from langchain_openai.chat_models import ChatOpenAI
from langchain.schema import (
    AIMessage,
    HumanMessage,
    SystemMessage
)

def execute_prompt(text, history, jarvis_setup):
    print(f"You said: {text}")
    history.append(HumanMessage(content = text))
    response = jarvis_setup(history)
    history.append(AIMessage(content = response.content))
    with open('JarvisMemory.txt', 'wb') as file:
        pickle.dump(history, file)
        
    print(response.content)

def main():
    jarvis_setup = ChatOpenAI(openai_api_key="API_KEY", model = "gpt-3.5-turbo", temperature = 0.7, max_tokens = 400)
    #history = [SystemMessage(content="You are a human-like virtual assistant named Jarvis.", additional_kwargs={})]
    if os.path.exists("JarvisMemory.txt"):
        with open("JarvisMemory.txt", "rb") as file:
            history = pickle.load(file)
    else:
        with open("JarvisMemory.txt", "wb") as file:
            history = [SystemMessage(content="You are a human-like virtual assistant named Jarvis. Answer all questions as shortly as possible, unless a longer, more detailed response is requested.", additional_kwargs={})]
            pickle.dump(history, file)
    
    while True:
        print("\n")
        print("Enter prompt.")
        text = input().lower()
        print("Prompt sent.")
    
        if text:
            execute_prompt(text, history, jarvis_setup)
                        
        else:
            print("No prompt given.")
            continue
                    
if __name__ == "__main__":
    main()

And I get this error:

LangChainDeprecationWarning: The method BaseChatModel.__call__ was deprecated in langchain-core 0.1.7 and will be removed in 0.3.0. Use invoke instead.
  warn_deprecated(
Traceback (most recent call last):
  File "C:\Users\maste\Documents\Coding\Python\Jarvis\JarvisTextInpuhjhjghyjvjt.py", line 44, in <module>
    main()
  File "C:\Users\maste\Documents\Coding\Python\Jarvis\JarvisTextInpuhjhjghyjvjt.py", line 37, in main
    execute_prompt(text, history, jarvis_setup)
  File "C:\Users\maste\Documents\Coding\Python\Jarvis\JarvisTextInpuhjhjghyjvjt.py", line 12, in execute_prompt
    response = jarvis_setup(history)
  File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_core\_api\deprecation.py", line 148, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
  File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_core\language_models\chat_models.py", line 847, in __call__
    generation = self.generate(
  File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_core\language_models\chat_models.py", line 456, in generate
    raise e
  File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_core\language_models\chat_models.py", line 446, in generate
    self._generate_with_cache(
  File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_core\language_models\chat_models.py", line 671, in _generate_with_cache
    result = self._generate(
  File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_openai\chat_models\base.py", line 520, in _generate
    message_dicts, params = self._create_message_dicts(messages, stop)
  File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_openai\chat_models\base.py", line 533, in _create_message_dicts
    message_dicts = [_convert_message_to_dict(m) for m in messages]
  File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_openai\chat_models\base.py", line 533, in <listcomp>
    message_dicts = [_convert_message_to_dict(m) for m in messages]
  File "C:\Users\maste\AppData\Roaming\Python\Python310\site-packages\langchain_openai\chat_models\base.py", line 182, in _convert_message_to_dict
    if (name := message.name or message.additional_kwargs.get("name")) is not None:
AttributeError: 'SystemMessage' object has no attribute 'name'

I'm guessing I need to add ".invoke" somewhere in the code based on some research I did on the issue, but I'm a beginner.

I found this page showing a very similar error and how to fix it: https://wikidocs.net/235780 (you can translate it to English with Google Translate, and the translation is good enough to follow). It says to add ".invoke" in the place shown on the page, but I'm not sure how to apply that to my code. This might not even be the right solution.

I also looked at the LangChain website, and it also says to use "invoke", but I can't find an example of it being used in a full line of code.

2 Answers


Here's the solution! I just figured it out, and it was a very simple mistake: when changing langchain_community to langchain_openai, remove the ".chat_models" part. That's all it was!

So this line: from langchain_community.chat_models import ChatOpenAI

Should be this: from langchain_openai import ChatOpenAI

This is how I figured it out: https://python.langchain.com/v0.2/docs/versions/v0_2/#upgrade-to-new-imports

Also, at least in my code, I had to add ".invoke" after jarvis_setup, so response = jarvis_setup(history) becomes response = jarvis_setup.invoke(history)

With those two changes, I get no warnings and no errors!
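Put together, the corrected flow looks like the sketch below. Msg and DummyChat are stand-ins I'm using so the example runs without an API key; with the real library you would use from langchain_openai import ChatOpenAI and make the same .invoke call on the model.

```python
# Real-library fix, per the answer above:
#   from langchain_openai import ChatOpenAI        # not langchain_openai.chat_models
#   response = jarvis_setup.invoke(history)        # .invoke replaces the deprecated __call__

class Msg:
    """Minimal stand-in for LangChain message objects (illustration only)."""
    def __init__(self, content):
        self.content = content

class DummyChat:
    """Stand-in for ChatOpenAI: .invoke takes the message list, returns a message."""
    def invoke(self, messages):
        return Msg(f"echo: {messages[-1].content}")

def execute_prompt(text, history, jarvis_setup):
    history.append(Msg(text))
    response = jarvis_setup.invoke(history)  # was: jarvis_setup(history)
    history.append(Msg(response.content))
    return response.content

history = []
print(execute_prompt("hello", history, DummyChat()))  # echo: hello
```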



Maybe this can help you.

First you create a prompt, then the LLM, then chain them with LCEL. That gives you a runnable, which can then be used for invoking or streaming.

from typing import Dict, List

from langchain_openai.chat_models import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.messages import HumanMessage, AIMessage
from langchain_core.output_parsers import StrOutputParser

TEMPLATE="""Your Prompt here

chat_history: {chat_history}
question: {question}
"""

example_chat_history= [
{
   "human": "Hi there",
   "ai": "Hello! how I can help you?"
}
]
class ChatBot:

    def __init__(self, chat_history: List[Dict]):
        self.chat_history = self.create_chat_history(chat_history=chat_history)
        self.llm = ChatOpenAI(model="gpt-4o", temperature=0)

    def create_chat_history(self, chat_history):
        return [(HumanMessage(content=message["human"]), AIMessage(content=message["ai"])) for message in
                             chat_history]

    def create_chat_history_message(self, message, result):
        return self.chat_history.extend([HumanMessage(content=message), result])

    def rephrase_question(self, query):
        prompt = ChatPromptTemplate.from_template(TEMPLATE)
        runnable = (
                prompt
                | self.llm
                | StrOutputParser()
        )

        return runnable.invoke({"question": query, "chat_history": self.chat_history[-5:]},
                               {"callbacks": []})
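The LCEL pipe in rephrase_question is just function composition: the prompt fills its template from the input dict, the model generates, and StrOutputParser extracts the plain string. A pure-Python sketch of that data flow (the Toy* classes are illustrative stand-ins, not LangChain APIs, so no key is needed):

```python
class ToyMessage:
    """Stand-in for an AIMessage: just carries .content."""
    def __init__(self, content):
        self.content = content

class ToyPrompt:
    """Like ChatPromptTemplate: fills the template from the input dict."""
    def __init__(self, template):
        self.template = template
    def invoke(self, variables):
        return self.template.format(**variables)

class ToyModel:
    """Like a chat model: takes a prompt string, returns a message object."""
    def invoke(self, prompt):
        return ToyMessage(f"[model answer to: {prompt}]")

class ToyParser:
    """Like StrOutputParser: message -> plain string."""
    def invoke(self, message):
        return message.content

class ToyChain:
    """Mimics `prompt | llm | parser`: each step's output feeds the next step."""
    def __init__(self, *steps):
        self.steps = steps
    def invoke(self, value):
        for step in self.steps:
            value = step.invoke(value)
        return value

chain = ToyChain(ToyPrompt("question: {question}"), ToyModel(), ToyParser())
print(chain.invoke({"question": "Hi"}))  # [model answer to: question: Hi]
```

With the real classes, the `|` operator builds the same kind of step-by-step pipeline, and .invoke on the resulting runnable drives it end to end.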

3 Comments

If I change the import to "from langchain_community.chat_models import ChatOpenAI", the code works fine, but I get a deprecation warning. LangChainDeprecationWarning: The class ChatOpenAI was deprecated in LangChain 0.0.10 and will be removed in 0.3.0. An updated version of the class exists in the langchain-openai package and should be used instead. To use it run pip install -U langchain-openai and import as from langchain_openai import ChatOpenAI. warn_deprecated(
If you update your versions to the latest ones, that warning will go away.
Which langchain modules?
