Dev branch local ollama suggestion #673

Open
Super-six-java opened this issue Aug 9, 2024 · 0 comments
Labels
question Further information is requested

@Super-six-java

Hi,
When I run the dev branch with the local Ollama qwen2:72b model, should I use OllamaFunctions when loading the Ollama model in the get_llm() method in llm.py?

    # OllamaFunctions comes from langchain_experimental:
    # from langchain_experimental.llms.ollama_functions import OllamaFunctions
    elif "ollama" in model_version:
        model_name, base_url = env_value.split(",")
        llm = OllamaFunctions(base_url=base_url, model=model_name)
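
For context, here is a minimal sketch of the environment value that split expects. Only the comma-separated "model,base_url" shape is implied by the code above; the variable name LLM_MODEL_CONFIG_ollama is a hypothetical placeholder:

    import os

    # Hypothetical env entry; the variable name is an assumption, only the
    # "model,base_url" format is implied by env_value.split(",") above.
    os.environ["LLM_MODEL_CONFIG_ollama"] = "qwen2:72b,http://localhost:11434"

    model_name, base_url = os.environ["LLM_MODEL_CONFIG_ollama"].split(",")
    # model_name == "qwen2:72b", base_url == "http://localhost:11434"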

The OllamaFunctions class inherits from the ChatOllama class and appears to have the with_structured_output() method, but there is another issue: with_structured_output() seems to only output in JSON format.
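
For illustration, a minimal standalone sketch of that call; the Person schema, model name, and base URL are assumptions about a local setup, not code from this repo:

    from langchain_core.pydantic_v1 import BaseModel, Field
    from langchain_experimental.llms.ollama_functions import OllamaFunctions

    class Person(BaseModel):
        """Hypothetical schema used only to illustrate structured output."""
        name: str = Field(description="The person's name")
        age: int = Field(description="The person's age")

    llm = OllamaFunctions(base_url="http://localhost:11434", model="qwen2:72b")
    structured_llm = llm.with_structured_output(Person, include_raw=False)

    result = structured_llm.invoke("Anna is 29 years old.")
    # Expected: a Person instance, e.g. Person(name="Anna", age=29)
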
If you use this method, you also need to change the schema_extraction_from_text() method in the schema_extraction.py file, for example:

    prompt = ChatPromptTemplate.from_messages(
        [("system", schema_prompt), ("user", "{text}")]
    )

    # Ollama models take the default structured-output path (JSON mode),
    # while other chat models keep the function-calling method.
    if llm.get_name().startswith("Ollama"):
        runnable = prompt | llm.with_structured_output(schema=Schema, include_raw=False)
    else:
        runnable = prompt | llm.with_structured_output(
            schema=Schema,
            method="function_calling",
            include_raw=False,
        )
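
The Schema referenced above would be the Pydantic model that schema_extraction.py already defines; for a self-contained picture, here is a sketch of such a model and of invoking the runnable. The field names are illustrative assumptions, not the repo's actual definition:

    from typing import List
    from langchain_core.pydantic_v1 import BaseModel, Field

    class Schema(BaseModel):
        """Illustrative stand-in for the Schema model in schema_extraction.py."""
        labels: List[str] = Field(description="Node labels found in the text")
        relationship_types: List[str] = Field(description="Relationship types found in the text")

    result = runnable.invoke({"text": "Marie Curie won the Nobel Prize in Physics."})
    # With include_raw=False, result is a Schema instance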

and also change the get_graph_document_list() method in the llm.py file:

def get_graph_document_list(
    llm, combined_chunk_document_list, allowedNodes, allowedRelationship
):
    futures = []
    graph_document_list = []
    # if llm.get_name() == "ChatOllama":
    if llm.get_name().startswith("Ollama"):
        # Local Ollama models: skip extracting extra node properties
        node_properties = False
    else:
        node_properties = ["description"]
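
For context, node_properties would then feed into the LLMGraphTransformer that builds the graph documents. A sketch of how the rest of the function might use it, assuming the LLMGraphTransformer API from langchain_experimental (the actual construction in llm.py may differ):

    from langchain_experimental.graph_transformers import LLMGraphTransformer

    llm_transformer = LLMGraphTransformer(
        llm=llm,
        node_properties=node_properties,  # False or a list of property names
        allowed_nodes=allowedNodes,
        allowed_relationships=allowedRelationship,
    )
    graph_document_list = llm_transformer.convert_to_graph_documents(
        combined_chunk_document_list
    )
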
@kartikpersistent added the question (Further information is requested) label on Aug 9, 2024