GenAI Hub SDK, No deployment found with: deployment.model_name when using gpt-4o or gpt-4o-mini

ZaynabR
Associate

Hi

I am using the GenAI Hub SDK and its init_llm function to initialize the LLM. It works for certain models such as gpt-35 and gpt-4, but when I use gpt-4o or gpt-4o-mini, I get the following error:

'No deployment found with: deployment.model_name == gpt-4o'

The exact code I am using:

from langchain_core.output_parsers import StrOutputParser
from langchain.prompts import PromptTemplate
from gen_ai_hub.proxy.langchain.init_models import init_llm

# Works for gpt-35 and gpt-4, but fails with "No deployment found" for gpt-4o / gpt-4o-mini
llm = init_llm('gpt-4o', max_tokens=4500, temperature=0)
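
In case it helps with diagnosis, below is a minimal sketch of how I would try to list the deployments the SDK's proxy client actually resolves, to check whether a gpt-4o deployment is visible at all. The get_proxy_client helper and the deployments attribute are assumptions on my part based on the SDK documentation, not verified code; only model_name is taken from the error message itself.

# Diagnostic sketch (assumptions: get_proxy_client and the deployments attribute;
# model_name is the attribute referenced in the error message).
from gen_ai_hub.proxy import get_proxy_client

proxy_client = get_proxy_client('gen-ai-hub')

# Print every deployment the proxy client can see, to check whether gpt-4o shows up.
for deployment in proxy_client.deployments:
    print(deployment.model_name, deployment)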

Accepted Solutions (0)

Answers (0)