# Self-Evaluating LLM with LangGraph

## Getting Started

This example requires a locally running Ollama instance with the `mistral` model available.

```bash
# assumes Ollama is already running
# clone the repo
git clone git@github.com:CuriouslyCory/self-evaluating-langgraph.git
cd self-evaluating-langgraph

# install dependencies
npm i

# run the local development server
npm run dev
```

Then browse to http://localhost:3000/ to see the app.

You can change the model from `mistral` to one of your choice in `src/server/api/routers/post.ts`.
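
For example, if the router sets up the model through LangChain's `ChatOllama` wrapper (an assumption; the actual wiring in `post.ts` may differ), swapping the model is a one-line change:

```ts
// Hypothetical sketch of the model setup in src/server/api/routers/post.ts;
// the real file may be structured differently.
import { ChatOllama } from "@langchain/community/chat_models/ollama";

const model = new ChatOllama({
  baseUrl: "http://localhost:11434", // default Ollama endpoint
  model: "mistral", // replace with any model you have pulled, e.g. "llama3"
  temperature: 0,
});
```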

## About

This is a T3 Stack project bootstrapped with create-t3-app. Also proudly using:

- shadcn/ui for the UI primitives
- LangChain
- LangGraph for the self-evaluation loop (sketched below)
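
A self-evaluating loop in LangGraph generally wires a generation node to an evaluation node and loops back until the evaluation passes. The sketch below shows that pattern with LangGraph.js and Ollama; the node names, state shape, and prompts are illustrative assumptions and do not necessarily match the code in this repo.

```ts
// Minimal, illustrative self-evaluation loop with LangGraph.js.
// State fields, node names, and prompts are assumptions for illustration.
import { ChatOllama } from "@langchain/community/chat_models/ollama";
import { StateGraph, Annotation, START, END } from "@langchain/langgraph";

const model = new ChatOllama({ model: "mistral", temperature: 0 });

// Shared state that flows through the graph.
const GraphState = Annotation.Root({
  question: Annotation<string>(),
  answer: Annotation<string>(),
  verdict: Annotation<string>(),
});

const graph = new StateGraph(GraphState)
  // Draft an answer to the question.
  .addNode("generate", async (state) => {
    const res = await model.invoke(`Answer concisely: ${state.question}`);
    return { answer: res.content as string };
  })
  // Ask the model to judge its own answer.
  .addNode("evaluate", async (state) => {
    const res = await model.invoke(
      `Question: ${state.question}\nAnswer: ${state.answer}\n` +
        `Reply with exactly PASS or FAIL.`,
    );
    return { verdict: (res.content as string).trim().toUpperCase() };
  })
  .addEdge(START, "generate")
  .addEdge("generate", "evaluate")
  // Loop back to "generate" until the evaluation passes.
  .addConditionalEdges("evaluate", (state) =>
    state.verdict.startsWith("PASS") ? END : "generate",
  )
  .compile();

// Usage: const result = await graph.invoke({ question: "What is LangGraph?" });
```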