POST /api/v3/answer-engine/{engine_id}/chat
Chat
Example request:

curl --request POST \
  --url https://api.nugen.in/api/v3/answer-engine/{engine_id}/chat \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '
{
  "query": "<string>",
  "model_llm": "llama-v3p3-70b-instruct",
  "mode": "vanilla",
  "top_k": 5,
  "conv_thread_id": "<string>",
  "prompt_id": "<string>"
}
'
Example response:

{
  "answer_engine_id": "<string>",
  "query": "<string>",
  "answer": "<string>",
  "conv_thread_id": "<string>",
  "query_id": "<string>",
  "model_llm": "<string>",
  "metadata": {},
  "rewritten_query": "<string>",
  "relevance_score": 123,
  "token_usage": {
    "input_tokens": 123,
    "output_tokens": 123,
    "total_tokens": 123
  },
  "prompt_id": "<string>",
  "prompt_name": "<string>"
}

Authorizations

Authorization
string
header
required

Bearer authentication header of the form Bearer <token>, where <token> is your auth token.

Path Parameters

engine_id
string
required

Unique answer engine identifier

Body

application/json

Request body for chatting with an answer engine using RAG.

query
string
required

User's question

model_llm
string
default:llama-v3p3-70b-instruct

LLM model for generation

mode
string
default:vanilla

Inference mode: vanilla or agentic

top_k
integer
default:5

Number of context chunks to use (1–30)

Required range: 1 <= x <= 30
conv_thread_id
string | null

Thread ID for multi-turn conversation

prompt_id
string | null

Prompt ID to use for generation
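The body parameters above can be assembled into a request in a few lines. Below is a minimal Python sketch using only the standard library; the helper names (`build_chat_payload`, `chat`) are illustrative, not part of the API, and the defaults and the 1–30 `top_k` range mirror the documented schema.

```python
import json
import urllib.request

API_BASE = "https://api.nugen.in/api/v3"

def build_chat_payload(query, model_llm="llama-v3p3-70b-instruct",
                       mode="vanilla", top_k=5,
                       conv_thread_id=None, prompt_id=None):
    """Build the JSON body for the chat endpoint, applying documented defaults."""
    if not 1 <= top_k <= 30:
        raise ValueError("top_k must satisfy 1 <= top_k <= 30")
    if mode not in ("vanilla", "agentic"):
        raise ValueError("mode must be 'vanilla' or 'agentic'")
    payload = {"query": query, "model_llm": model_llm,
               "mode": mode, "top_k": top_k}
    # conv_thread_id and prompt_id are nullable; omit them when unset.
    if conv_thread_id is not None:
        payload["conv_thread_id"] = conv_thread_id
    if prompt_id is not None:
        payload["prompt_id"] = prompt_id
    return payload

def chat(engine_id, token, **kwargs):
    """POST a chat request to the answer engine and return the parsed JSON."""
    body = json.dumps(build_chat_payload(**kwargs)).encode()
    req = urllib.request.Request(
        f"{API_BASE}/answer-engine/{engine_id}/chat",
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Validating `top_k` and `mode` client-side avoids a round trip for requests the server would reject anyway.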

Response

Returns the generated answer along with supporting document chunks and token usage statistics

Response containing the generated answer, thread ID, and token usage for a chat request.

answer_engine_id
string
required
query
string
required
answer
string
required
conv_thread_id
string | null
query_id
string | null
model_llm
string | null
metadata
Metadata · object
rewritten_query
string | null
relevance_score
number | null
token_usage
TokenUsage · object

Token consumption breakdown for a single chat request.

prompt_id
string | null
prompt_name
string | null
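Since several response fields are nullable, a client should read them defensively. A small Python sketch, following the field names in the schema above (the helper names are hypothetical):

```python
def total_tokens(resp):
    """Sum token usage from a chat response; token_usage fields may be absent."""
    usage = resp.get("token_usage") or {}
    return usage.get("total_tokens",
                     usage.get("input_tokens", 0) + usage.get("output_tokens", 0))

def continue_thread_payload(resp, follow_up):
    """Build a follow-up query body, reusing conv_thread_id for multi-turn chat."""
    payload = {"query": follow_up}
    # conv_thread_id is string | null; only carry it forward when present.
    if resp.get("conv_thread_id"):
        payload["conv_thread_id"] = resp["conv_thread_id"]
    return payload
```

Reusing `conv_thread_id` from one response in the next request is what turns independent queries into a multi-turn conversation.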