Part 3 in a series of 3 (maybe more)
This article demonstrates how to use the OpenAI Query Function to tag articles with an LLM. The author introduces a real-world example from their open-source project, LLMAssistant. The goal is to eventually automate more tasks, such as organizing tasks using the Bullet Journal methodology. However, there are risks associated with SQL generation, as highlighted by OpenAI. The author also briefly touches on two functions they’ve implemented: the “Get Existing Tags” function and the “Tag Article” function.
Following on from this post here, I want to show how to use the OpenAI Query Function to query the database. This example will use tags: the LLM will be tasked with choosing 1–3 existing tags that seem to fit the article, then adding 1–2 more that might be relevant.
This is a straightforward but real example that I’ve begun to use in my LLMAssistant project here, which is open-source.
In the future, I aim to have this system handle more of the “code,” such as inserting examples of those tags. Ideally, I’d like the LLM to help me organize my week and month following the Bullet Journal methodology. While this might seem counterintuitive to the purpose of Bullet Journaling, it’s an experiment I’m keen to try. Overall, it really is an exercise in how and when to use the LLM rather than just coding it myself.
As the OpenAI Cookbook link warns, “SQL generation can be high-risk in a production environment since models are not perfectly reliable at generating correct SQL.” This is an essential consideration. Currently, LLMAssistant is a solo project, but I’m considering asking the LLM to generate ORM-based queries instead of raw SQL, given its familiarity with Eloquent.
I’ve discussed how to add functions to your application in my previous two articles,
so I won’t delve into that here. Instead, I’ll introduce the following two functions:
Get Existing Tags Function
This function is simple enough for me to code manually, but I’m keen on leveraging the LLM more and reducing manual coding. Perhaps in the future, the query could be tailored to the content, and using a vector-based dataset, it could identify or even create relevant tags.
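To make this concrete, here is a rough sketch of what a function definition like this looks like when attached to an OpenAI chat request. The project itself is Laravel/PHP, so this Python version is purely illustrative, and the snake_case name `get_existing_tags` and the exact schema are my assumptions based on the article:

```python
# Hypothetical function definition the LLM would see.
# The real implementation lives in the LLMAssistant Laravel codebase.
get_existing_tags_function = {
    "name": "get_existing_tags",
    "description": "Return the tags already in the system, so the model "
                   "can prefer existing tags over inventing new ones.",
    "parameters": {
        "type": "object",
        "properties": {},  # no inputs needed: it just lists current tags
        "required": [],
    },
}


def get_existing_tags(db_rows):
    """Toy stand-in: the real app would query the tags table instead."""
    return sorted({row["name"] for row in db_rows})


print(get_existing_tags([{"name": "laravel"}, {"name": "llm"}, {"name": "laravel"}]))
# → ['laravel', 'llm']
```

Because the function takes no parameters, the LLM only has to decide *when* to call it; the deduplicated tag list comes back as the function result for the model to choose from.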
As noted in the previous articles, for better or worse I make “helper” functions in Laravel, then connect them to the real Call, passing in a consistent Data Object and returning the model Message. This allows me to chain these functions together. See `app/helpers.php` to trace that code.
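The chaining pattern described above can be sketched in a few lines. This is a Python analogue, not the actual `app/helpers.php` code: each helper accepts the same data object and returns it (standing in for the model Message), so calls compose. The class and helper names here are invented for illustration:

```python
from dataclasses import dataclass, field


@dataclass
class AssistantData:
    """Hypothetical stand-in for the consistent Data Object passed between helpers."""
    content: str
    tags: list = field(default_factory=list)


def get_existing_tags_helper(data: AssistantData) -> AssistantData:
    # Stand-in for a DB query; the real helper would hit the tags table.
    data.tags = ["laravel", "llm", "productivity"]
    return data


def tag_article_helper(data: AssistantData) -> AssistantData:
    # Stand-in for the LLM call that picks fitting tags for the content.
    data.tags = [t for t in data.tags if t in data.content.lower()]
    return data


# Because every helper shares the same in/out shape, they chain cleanly:
result = tag_article_helper(
    get_existing_tags_helper(AssistantData("An article about Laravel and LLM tooling"))
)
print(result.tags)  # → ['laravel', 'llm']
```

The payoff of the consistent shape is exactly this composability: any helper can be dropped into or out of the chain without changing the others.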
Tag Article Function
This function guides the LLM on how to format the data the system requires. The system then handles the rest, such as finding or creating tags. Once again, this showcases the LLM’s capability to format data as needed for parameters.
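A sketch of how that division of labor might look: the LLM only fills in a structured `tags` parameter, and the system does the find-or-create work. Again this is illustrative Python, not the project's Laravel code, and the name `tag_article` plus the schema are assumptions from the article's description:

```python
import json

# Hypothetical shape of the "tag_article" function definition.
tag_article_function = {
    "name": "tag_article",
    "description": "Attach 1-3 existing tags plus 1-2 new tags to the article.",
    "parameters": {
        "type": "object",
        "properties": {
            "tags": {
                "type": "array",
                "items": {"type": "string"},
                "description": "Tag names chosen by the model",
            }
        },
        "required": ["tags"],
    },
}


def handle_tag_article(arguments_json: str, existing: dict) -> list:
    """Find-or-create each tag the model returned (toy in-memory version;
    the real system would do this against the database)."""
    tags = json.loads(arguments_json)["tags"]
    for name in tags:
        existing.setdefault(name.lower(), {"name": name.lower()})
    return sorted(existing)


store = {"llm": {"name": "llm"}}
print(handle_tag_article('{"tags": ["Laravel", "llm", "bullet-journal"]}', store))
# → ['bullet-journal', 'laravel', 'llm']
```

The key point is that the model never touches storage: it only produces well-formed arguments, which keeps the risky side effects in ordinary application code.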
My daily routine involves sending emails to the assistant. These emails contain links. As mentioned in previous articles, the assistant uses the “get_content_from_url” function. I then use a prompt that hints to the LLM to use the attached functions on the article I send.
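Put together, the request at this step might look roughly like the payload below. This builds the request dictionary only (no network call); the system prompt wording and the two tag-function schemas are my assumptions, while `get_content_from_url` is the function named in the earlier articles:

```python
# Illustrative Chat Completions payload with all three functions attached.
request = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {
            "role": "system",
            "content": "When given a URL, fetch it with get_content_from_url, "
                       "then use get_existing_tags and tag_article to tag it.",
        },
        {"role": "user", "content": "https://example.com/some-article"},
    ],
    "functions": [
        {
            "name": "get_content_from_url",
            "parameters": {
                "type": "object",
                "properties": {"url": {"type": "string"}},
                "required": ["url"],
            },
        },
        {
            "name": "get_existing_tags",
            "parameters": {"type": "object", "properties": {}},
        },
        {
            "name": "tag_article",
            "parameters": {
                "type": "object",
                "properties": {"tags": {"type": "array", "items": {"type": "string"}}},
                "required": ["tags"],
            },
        },
    ],
    "function_call": "auto",  # let the model decide which function to call
}

print([f["name"] for f in request["functions"]])
# → ['get_content_from_url', 'get_existing_tags', 'tag_article']
```

With `function_call` set to `"auto"`, the model is free to chain the fetch, the tag lookup, and the tagging call in whatever order the conversation requires.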
The outcome is the tagged article, and the “Reply” section provides insight into the reasons behind the chosen tags.
Follow-Up Ideas
Transition to Eloquent-formatted queries.
OpenAI API Cookbook
LLMAssistant GitHub Repo
Previous Article Showing how I make functions