LOCAL-LLM-SERVER (LLS) is an application that runs open-source LLM models on your local machine. It provides an OpenAI-compatible completion API, along with a command-line based Chatbot ...
It contains a production-grade implementation including deployment code with CDK and a CI/CD pipeline, testing, observability and more (see the Features section). Choose the architecture that you see fit, ...
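
As a rough illustration of what an OpenAI-compatible API means in practice, the sketch below points the standard `openai` Python client at a locally running LLS instance. The base URL, port, and model name are assumptions and will depend on your configuration.

```python
# Minimal sketch: querying a local OpenAI-compatible endpoint with the
# standard `openai` Python client. The base_url, port, and model name are
# assumptions -- adjust them to match your LLS configuration.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local LLS endpoint
    api_key="not-needed",                 # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder model name
    messages=[{"role": "user", "content": "Hello, who are you?"}],
)
print(response.choices[0].message.content)
```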