I have the following LangGraph code. I can't seem to integrate it with FastAPI correctly. I want to send the graph an input (built by the inp function), pass it through a LangGraph workflow, and then return the output from FastAPI.
I can go to the playground and input some text: http://0.0.0.0:5043/generate/playground/
I get the following output:
{
  "node_1": "{'question': ['helo']} Hi ",
  "node_2": "{'question': ['helo']} Hi there"
}
However, when I go to the docs http://0.0.0.0:5043/docs - I click "Try it out" and I can't see where I can input anything.
The overall objective is to take input from an HTML search box, pass it through LangGraph, and display the output as HTML.
Additionally, I am a little unsure about the code, since I am using add_routes, but in the documentation they use async functions and app.get. How can I replace add_routes and define the routes using app.get/app.post?
@app.get("/items/{item_id}")
async def read_item(item_id):
    return {"item_id": item_id}
Code:
from langgraph.graph import Graph
from fastapi import FastAPI
import uvicorn
from langserve import add_routes
from langchain_core.runnables import RunnableLambda


def function_1(input_1):
    return str(input_1) + " Hi "


def function_2(input_2):
    return input_2 + "there"


# Build the two-node graph.
workflow = Graph()
workflow.add_node("node_1", function_1)
workflow.add_node("node_2", function_2)
workflow.add_edge("node_1", "node_2")
workflow.set_entry_point("node_1")
workflow.set_finish_point("node_2")
app_graph = workflow.compile()


def inp(question: str) -> dict:
    return {"question": list({question})}

################################################################################

def out(value: dict):
    return value


# Wrap the compiled graph with input/output adapters.
final_chain = RunnableLambda(inp) | app_graph | RunnableLambda(out)

fastapi_app = FastAPI()
add_routes(fastapi_app, final_chain, path="/generate")

if __name__ == "__main__":
    uvicorn.run(fastapi_app, host="0.0.0.0", port=5043)
    # http://localhost:5043/generate/playground/
################################################################################
However, when I go to the docs http://0.0.0.0:5043/docs - I click "Try it out" and I can't see where I can input anything.
If you want your route to accept input, you need to define a function that accepts query parameters, a request body, or form data.
Here's an example API that simulates a "Magic 8-ball"; it accepts either query parameters on a GET request or a JSON request body on a POST request:
import random

import fastapi
import pydantic


class Question(pydantic.BaseModel):
    question: str


class Answer(pydantic.BaseModel):
    question: Question
    answer: str


Answers = [
    "It is certain",
    "It is decidedly so",
    "Without a doubt",
    "Yes definitely",
    "You may rely on it",
    "As I see it, yes",
    "Most likely",
    "Outlook good",
    "Yes",
    "Signs point to yes",
    "Reply hazy, try again",
    "Ask again later",
    "Better not tell you now",
    "Cannot predict now",
    "Concentrate and ask again",
    "Don't count on it",
    "My reply is no",
    "My sources say no",
    "Outlook not so good",
    "Very doubtful",
]

app = fastapi.FastAPI()


@app.post("/generate")
def generate(question: Question) -> Answer:
    answer = random.choice(Answers)
    return Answer(question=question, answer=answer)


@app.get("/generate")
def generate_get(question: str) -> Answer:
    answer = random.choice(Answers)
    return Answer(question=Question(question=question), answer=answer)
If we visit /docs for this application and click the "Try it out" button for our /generate endpoint, we see that there is now a "request body" field. If we enter a question (e.g., "Is this a good answer?") and click "Execute", we get a response (along with an example curl command line).
Using curl, we can demonstrate a POST request:
curl -X 'POST' 'http://localhost:8000/generate' \
     -H content-type:application/json \
     -d '{"question": "Is this code working?"}'
And a GET request:
curl 'http://localhost:8000/generate?question=Is+this+code+working%3F'
To turn this into a web application with a user-facing interface, you would need to create the necessary HTML form and possibly some JavaScript to handle the form submission and response.
If you weren't looking for an API, you could do something similar using form data and have FastAPI directly return an HTML response.
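For instance, here is a rough, untested sketch of that form-data approach. The / and /ask routes, the FORM_HTML string, and the echoed answer are just placeholders I made up, and Form(...) requires the python-multipart package to be installed:

from fastapi import FastAPI, Form
from fastapi.responses import HTMLResponse

app = FastAPI()

# A bare-bones search box; in practice you would likely use a template.
FORM_HTML = """
<form action="/ask" method="post">
  <input type="text" name="question" placeholder="Ask a question">
  <button type="submit">Ask</button>
</form>
"""


@app.get("/", response_class=HTMLResponse)
def index() -> str:
    # Serve the HTML form.
    return FORM_HTML


@app.post("/ask", response_class=HTMLResponse)
def ask(question: str = Form(...)) -> str:
    # Placeholder answer; in your setup this is where you would call
    # final_chain.invoke(question) and render its result instead.
    answer = f"You asked: {question}"
    return f"<p>{answer}</p>" + FORM_HTML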
I am finding it difficult to see how I can pass the final_chain langgraph chain to the endpoint.
I am completely unfamiliar with the modules you are working with (langgraph, langserve, etc.), but taking a very brief look at the docs, maybe something like this?
import fastapi
import pydantic
from langgraph.graph import Graph
from langserve import add_routes
from langchain_core.runnables import RunnableLambda


class Question(pydantic.BaseModel):
    question: str


class Answer(pydantic.BaseModel):
    question: Question
    answer: str


def function_1(input_1):
    return str(input_1) + " Hi "


def function_2(input_2):
    return input_2 + "there"


workflow = Graph()
workflow.add_node("node_1", function_1)
workflow.add_node("node_2", function_2)
workflow.add_edge("node_1", "node_2")
workflow.set_entry_point("node_1")
workflow.set_finish_point("node_2")
app_graph = workflow.compile()


def inp(question: str) -> dict:
    return {"question": list({question})}


def out(value: dict):
    return value


final_chain = RunnableLambda(inp) | app_graph | RunnableLambda(out)

app = fastapi.FastAPI()


@app.post("/generate")
def generate(question: Question) -> Answer:
    answer = final_chain.invoke(question.question)
    return Answer(question=question, answer=answer)
If we call the above endpoint like this:
curl -X 'POST' 'http://localhost:8000/generate' \
     -H content-type:application/json \
     -d '{"question": "Is this code working?"}'
We get the response:
{
  "question": {
    "question": "Is this code working?"
  },
  "answer": "{'question': ['Is this code working?']} Hi there"
}
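You also asked about the async style used in the FastAPI docs. I haven't tested this, but since Runnables generally expose an async ainvoke counterpart to invoke, you could make the route an async function in the same way. The /generate_async path below is just an illustrative name, and the snippet assumes the app, Question, Answer and final_chain objects from the example above:

@app.post("/generate_async")
async def generate_async(question: Question) -> Answer:
    # Await the chain instead of calling it synchronously; this assumes
    # final_chain supports ainvoke like other Runnables.
    answer = await final_chain.ainvoke(question.question)
    return Answer(question=question, answer=answer)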