Getting ValueError When Using RunnableLambda with LLM in LangChain #31269
-
The error comes from passing a raw dict straight to the model's `.invoke()`; the input has to go through the prompt first so the model receives a `PromptValue`:

```python
from langchain_openai import AzureChatOpenAI
from langchain_core.runnables import RunnableLambda
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
import os

# Dummy prompt for SQL generation
final_prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant"),
    ("human", "{question}")
])

llm = None  # module-level cache for the lazily initialized LLM

def get_llm_from_state(state=None):
    """Create the Azure LLM on first use, reading credentials from the environment."""
    global llm
    if llm is None:
        llm = AzureChatOpenAI(
            model="gpt-35-turbo",
            deployment_name="gpt-35-turbo",
            openai_api_version="2023-05-15",
            azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
            openai_api_key=os.getenv("AZURE_OPENAI_KEY"),
        )
    return llm

# Defer LLM creation until the chain is actually invoked
llm_az = RunnableLambda(lambda prompt_value: get_llm_from_state().invoke(prompt_value))

sql_chain = final_prompt | llm_az | StrOutputParser()

# Sample input: the prompt formats the dict before the LLM sees it
result = sql_chain.invoke({"question": "What is 2+2?"})
print(result)
```

In this updated code, the prompt runs before the model, so the dict input is converted into a `PromptValue` and the `ValueError` no longer occurs.
-
I want to use the SQL as an input to another chain later. How should I format the sql_chain?
-
I want to use the sql_chain in the following chain:
-
I am getting the following error in the sql_chain
-
my final_prompt looks like:
-
I'm trying to dynamically initialize an LLM using RunnableLambda and pass it into a chain. I want to be able to get the deployment credentials for the LLM either from the user or from the environment file. But I am getting the error below:
ValueError: Invalid input type <class 'dict'>. Must be a PromptValue, str, or list of BaseMessages.
Minimal Reproducible Code
Why am I getting this error?
I believe it's happening because the prompt isn't formatted correctly before being passed to .invoke(), but I'm not sure how to fix it.
Any idea what I'm doing wrong? How can I safely pass a dynamically initialized LLM into a LangChain runnable chain?
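For the user-or-environment credential requirement, one pattern is to resolve the credentials once, preferring explicit user input and falling back to environment variables, before constructing `AzureChatOpenAI`. A sketch with standard-library code only; `resolve_azure_credentials` and the example values are hypothetical:

```python
import os

def resolve_azure_credentials(user_config=None):
    """Prefer user-supplied values; fall back to environment variables."""
    user_config = user_config or {}
    creds = {
        "azure_endpoint": user_config.get("azure_endpoint")
        or os.getenv("AZURE_OPENAI_ENDPOINT"),
        "openai_api_key": user_config.get("openai_api_key")
        or os.getenv("AZURE_OPENAI_KEY"),
    }
    missing = [name for name, value in creds.items() if not value]
    if missing:
        raise ValueError(f"Missing Azure OpenAI credentials: {missing}")
    return creds

# Example: the user supplies the endpoint, the key comes from the environment
os.environ["AZURE_OPENAI_KEY"] = "env-key"  # illustrative value only
creds = resolve_azure_credentials({"azure_endpoint": "https://example.openai.azure.com"})
print(creds["openai_api_key"])
```

The resulting dict can then be expanded into the constructor inside the lazy initializer, e.g. `AzureChatOpenAI(model="gpt-35-turbo", **creds, ...)`.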