I am using LangChain's expression language (LCEL) to create a retriever chain. The goal is a function where the AI checks the response from a previous interaction, so the function needs two inputs: the original question and the answer. In other words, I want to pass two parameters to the prompt: `question` and `answer`.

Currently I can only pass the `question`, as shown below. How can I also pass the `answer` into the prompt?
```python
import os

from langchain_community.vectorstores import FAISS
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableParallel, RunnablePassthrough
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

region = "usa"
vector_databases_path = "db"
k = 10
model_name = "gpt-4"
retriever_chain_type = "stuff"
input = "Albuquerque"
answer = ' MetroArea State\n0 Albuquerque NM'

####################### Get the Retriever ###################
# Set embeddings
embeddings = OpenAIEmbeddings()

# Load database embeddings
docsearch = FAISS.load_local(
    os.path.join(os.getcwd(), vector_databases_path, "MSA_State_faiss_index"),
    embeddings,
)

# Set up the retriever
retriever = docsearch.as_retriever(search_type="similarity", search_kwargs={"k": k})

prompt_template = """Use the following pieces of context to check the AI's answer.

{context}

Job: Your job is to check the MetroArea and State code based on what the AI
has returned.

Here is the original input: {question}
Here is the AI's answer: {answer}

Report your answer as a CSV with the column name AI_Correct.
AI_Correct: Return TRUE if the AI was correct or FALSE if it was incorrect.
Each variable after the comma should be one row. If there is no comma between
them, then treat the input as one row.
If you do not know the answer, then return: IDK
"""

####################### Initialize LLM ##################
llm = ChatOpenAI(model_name=model_name, temperature=0.0)

# How do I change this code to allow an additional parameter, answer,
# to pass through it to the prompt?
prompt = ChatPromptTemplate.from_template(prompt_template)
output_parser = StrOutputParser()

setup_and_retrieval = RunnableParallel(
    {"context": retriever, "question": RunnablePassthrough()}
)

chain = setup_and_retrieval | prompt | llm | output_parser
final_answer = chain.invoke(input)
final_answer
```
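To make the data flow I'm after concrete, here is a plain-Python sketch of the mapping I think `RunnableParallel` should perform: the chain is invoked with a dict, and each prompt variable is pulled out of that dict by key. The `fake_retriever` below is a hypothetical stand-in for the real FAISS retriever, just so the wiring can be shown without loading an index or an OpenAI key.

```python
from operator import itemgetter

# Hypothetical stand-in for the real FAISS retriever, so the data flow
# can be demonstrated without the vector store or an API key.
def fake_retriever(query):
    return [f"document matching {query!r}"]

def setup_and_retrieval(inputs):
    # Mirrors what RunnableParallel would do with the dict passed to
    # chain.invoke(): every output key is computed from the same input dict.
    return {
        "context": fake_retriever(itemgetter("question")(inputs)),
        "question": itemgetter("question")(inputs),
        "answer": itemgetter("answer")(inputs),
    }

prompt_inputs = setup_and_retrieval(
    {"question": "Albuquerque", "answer": " MetroArea State\n0 Albuquerque NM"}
)
print(sorted(prompt_inputs))  # prints ['answer', 'context', 'question']
```

If I read the LCEL docs correctly, the real equivalent would be `RunnableParallel({"context": itemgetter("question") | retriever, "question": itemgetter("question"), "answer": itemgetter("answer")})` invoked as `chain.invoke({"question": input, "answer": answer})`, but that is exactly the part I'd like confirmed.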