Inventor(s)

Anupam Mukherjee

Abstract

Large Language Model-based (LLM-based) fulfilment, while powerful, presents several limitations. These include scalability and performance challenges when handling a large set of functions, contextual and function ambiguity, maintenance complexity, debugging difficulties, and security risks due to role-based access control (RBAC) restrictions. Current approaches have not adequately addressed these limitations. An innovative approach is proposed herein to effectively tackle these issues, offering a more robust solution by employing a function vector database (DB) alongside a Retrieval Augmented Generation (RAG) retrieval mechanism to determine, outside of an LLM, the most suitable function to call. This approach effectively mitigates challenges that are otherwise among the most formidable obstacles in traditional LLM function-calling approaches.
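The sketch below illustrates the retrieval step described above under stated assumptions: it is not the disclosed implementation, the example function names are hypothetical, and the bag-of-words embed() stand-in would be replaced by a production embedding model and vector database in practice. The point it demonstrates is that function selection happens by similarity search over indexed function descriptions, outside the LLM, so the prompt never needs to carry the full function catalog.

```python
# Minimal sketch of RAG-style function selection outside the LLM.
# The embed() stand-in, the registry entries, and select_function() are
# illustrative assumptions, not the disclosed implementation.
import math
from collections import Counter


def embed(text: str) -> Counter:
    # Stand-in embedding: bag-of-words term counts. A real system would
    # call an embedding model and store the vectors in a vector DB.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


# Hypothetical function registry: each callable is indexed by an
# embedded natural-language description of what it does.
FUNCTIONS = {
    "get_device_status": "return the operational status of a network device",
    "restart_interface": "restart a specific interface on a network device",
    "list_open_tickets": "list currently open support tickets for a customer",
}
INDEX = {name: embed(desc) for name, desc in FUNCTIONS.items()}


def select_function(user_query: str, top_k: int = 1) -> list[tuple[str, float]]:
    # Retrieval step: rank registered functions by similarity to the user
    # query and return the best candidates. RBAC filtering and the actual
    # call can then be applied to this short list before invoking anything.
    q = embed(user_query)
    ranked = sorted(
        ((name, cosine(q, vec)) for name, vec in INDEX.items()),
        key=lambda item: item[1],
        reverse=True,
    )
    return ranked[:top_k]


if __name__ == "__main__":
    # Example: the query is matched to the closest registered function.
    print(select_function("what is the status of router r1?"))
```

Because only the top-ranked candidate (or a small short list) is ever surfaced to the LLM or executed, the approach sidesteps prompt-size growth as the function set scales and allows access-control checks to run before any function is called.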

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.
