
How to invoke a RAG chain with two inputs?


I'm trying to invoke a RAG chain with two input variables, but I'm getting an error.

def get_data_from_llm(raw_text):
    # Initialize the LLM and create a chain
    llm = Together(
        model="mistralai/Mixtral-8x7B-Instruct-v0.1",
        temperature=0.7,
        max_tokens=2048,
        top_k=1,
        together_api_key="api_key_here",
    )
    # chain = LLMChain(llm=llm, prompt=prompt_template, output_parser=JsonOutputParser(pydantic_object=InvoiceJSON))
    json_parser = SimpleJsonOutputParser(pydantic_object=InvoiceJSON)
    prompt_template = PromptTemplate(
        input_variables=["raw_text"],
        template="'raw_text': {raw_text},\n'json_structure': {json_structure}",
        partial_variables={"json_structure": json_parser.get_format_instructions()},
    )
    few_shot_prompt = FewShotPromptTemplate(
        examples=few_shot_examples,
        example_prompt=prompt_template,
        prefix=json_prefix,
        suffix=suffix,
        input_variables=['json_structure']
    )
    # Run the chain with the raw text
    chain = {"raw_text": RunnablePassthrough(), "json_structure": RunnableLambda(get_structure)} | few_shot_prompt | llm #| json_parser
    chain_data = chain.invoke(raw_text)
    print(chain_data)
    return chain_data

For the above code I'm getting the error below:

Traceback (most recent call last):
  File "D:\my_scripts\Invoice-Data-Extraction-Bot-using-LLAMA-2-and-Streamlit\utils.py", line 136, in <module>
    print(create_docs('Deccan Sources - 1173 - 12.12.22_12122022113240_15122022184846.pdf'))
  File "D:\my_scripts\Invoice-Data-Extraction-Bot-using-LLAMA-2-and-Streamlit\utils.py", line 116, in create_docs
    llm_extracted_data = get_data_from_llm(raw_data)
  File "D:\my_scripts\Invoice-Data-Extraction-Bot-using-LLAMA-2-and-Streamlit\examples.py", line 127, in get_data_from_llm
    chain_data = chain.invoke(raw_text)
  File "D:\my_scripts\Invoice-Data-Extraction-Bot-using-LLAMA-2-and-Streamlit\virenv\lib\site-packages\langchain_core\runnables\base.py", line 2493, in invoke
    input = step.invoke(
  File "D:\my_scripts\Invoice-Data-Extraction-Bot-using-LLAMA-2-and-Streamlit\virenv\lib\site-packages\langchain_core\prompts\base.py", line 128, in invoke
    return self._call_with_config(
  File "D:\my_scripts\Invoice-Data-Extraction-Bot-using-LLAMA-2-and-Streamlit\virenv\lib\site-packages\langchain_core\runnables\base.py", line 1620, in _call_with_config
    context.run(
  File "D:\my_scripts\Invoice-Data-Extraction-Bot-using-LLAMA-2-and-Streamlit\virenv\lib\site-packages\langchain_core\runnables\config.py", line 347, in call_func_with_variable_args
    return func(input, **kwargs)  # type: ignore[call-arg]
  File "D:\my_scripts\Invoice-Data-Extraction-Bot-using-LLAMA-2-and-Streamlit\virenv\lib\site-packages\langchain_core\prompts\base.py", line 112, in _format_prompt_with_error_handling
    return self.format_prompt(**_inner_input)
  File "D:\my_scripts\Invoice-Data-Extraction-Bot-using-LLAMA-2-and-Streamlit\virenv\lib\site-packages\langchain_core\prompts\string.py", line 229, in format_prompt
    return StringPromptValue(text=self.format(**kwargs))
  File "D:\my_scripts\Invoice-Data-Extraction-Bot-using-LLAMA-2-and-Streamlit\virenv\lib\site-packages\langchain_core\prompts\few_shot.py", line 165, in format
    return DEFAULT_FORMATTER_MAPPING[self.template_format](template, **kwargs)
  File "C:\Program Files\Python310\lib\string.py", line 161, in format
    return self.vformat(format_string, args, kwargs)
  File "D:\my_scripts\Invoice-Data-Extraction-Bot-using-LLAMA-2-and-Streamlit\virenv\lib\site-packages\langchain_core\utils\formatting.py", line 19, in vformat
    return super().vformat(format_string, args, kwargs)
  File "C:\Program Files\Python310\lib\string.py", line 165, in vformat
    result, _ = self._vformat(format_string, args, kwargs, used_args, 2)
  File "C:\Program Files\Python310\lib\string.py", line 205, in _vformat
    obj, arg_used = self.get_field(field_name, args, kwargs)
  File "C:\Program Files\Python310\lib\string.py", line 270, in get_field
    obj = self.get_value(first, args, kwargs)
  File "C:\Program Files\Python310\lib\string.py", line 227, in get_value
    return kwargs[key]
KeyError: '"properties"'
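From the KeyError on '"properties"', I suspect the literal curly braces inside the JSON format instructions are being treated as template variables when the few-shot prompt is formatted. I can reproduce the same kind of error with a much smaller snippet (a minimal sketch with a made-up schema string, not my actual code):

from langchain_core.prompts import PromptTemplate

# made-up schema text standing in for json_parser.get_format_instructions();
# it contains literal { } braces just like the real format instructions
schema_text = '{"properties": {"invoice_no": {"type": "string"}}}'

repro_prompt = PromptTemplate(
    input_variables=["raw_text"],
    template="'raw_text': {raw_text},\n'json_structure': " + schema_text,
)

# the f-string formatter treats {"properties": ...} as a template field
# and raises KeyError: '"properties"', the same error as in my traceback
repro_prompt.format(raw_text="some invoice text")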

I have tried two different ways, but I'm getting almost the same error in those cases as well.

1.

chain = few_shot_prompt | llm
chain_data = chain.invoke({"raw_text": raw_text, "json_structure": json_structure})
print(chain_data)

2.

chain = {"raw_text": RunnablePassthrough(), "json_structure": json_structure} | few_shot_prompt | llm
chain_data = chain.invoke(raw_text)
print(chain_data)

Can anyone suggest what I'm doing wrong here, or let me know if there is a more efficient way to do this?

