Laravel RAG — 3 Steps to Import Commits, Summarize & Create a Release Log

Alfred Nutile · 4 min read · Jun 5, 2024


The required Thumbnail!

For those who, like me, do not like to watch YouTube videos to learn something new (though I watch a ton of them 🤔), here are the steps to using LaraLamma.ai to easily build an automation that takes GitHub webhook events (pull requests, etc.) and turns them into a change log.

RAG — retrieval-augmented generation — an architectural approach that can improve the efficacy of large language model (LLM) applications.

When we are done with this, I hope two things will stand out. A RAG system can lead to a ton of automations, and it gives the non-techie users of those automations a way to use the "ease" of prompt writing to drive the results. I put "ease" in quotes because I see more and more that this is a skill that takes time to get good at.

Step 1 — Making the Collection

In LaraLamma.ai you can easily make a Collection to centralize this particular data.

Create Collection

It is here we will start to bring in data. This becomes our corpus of information for the LLM.

One thing I keep reminding people is that the LLM will not hallucinate if you give it a good prompt, confine it to the right context, and set the temperature to 0.

One example result to show this is below:

Example of NOT hallucinating

Here you will see it stuck to the rules.
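
If you wanted to wire that same idea up by hand outside of LaraLamma, it looks roughly like the sketch below. This is a minimal illustration, not LaraLamma's internal code; the OpenAI endpoint, the model name, and the config('services.openai.key') key are my own assumptions.

use Illuminate\Support\Facades\Http;

// Minimal sketch: confine the model to a fixed context at temperature 0.
// Endpoint, model name, and config key are assumptions for illustration.
function askWithContext(string $context, string $question): string
{
    $response = Http::withToken(config('services.openai.key'))
        ->post('https://api.openai.com/v1/chat/completions', [
            'model' => 'gpt-4o-mini',
            'temperature' => 0, // deterministic output, no creative wandering
            'messages' => [
                ['role' => 'system', 'content' =>
                    "Answer ONLY from the context below. If the answer is not there, say \"I don't know.\"\n\nContext:\n" . $context],
                ['role' => 'user', 'content' => $question],
            ],
        ]);

    return $response->json('choices.0.message.content');
}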

Step 2 — The Source

This term in LaraLamma just means "getting data in". It can be emails, uploaded files, and in this case a webhook (an API POST request).
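
Under the hood, a webhook Source is just an endpoint that accepts the POST and stores the payload against the Collection. Here is a minimal sketch of that idea in plain Laravel; the route, the Document model, and the column names are assumptions for illustration, not LaraLamma's actual code.

// routes/api.php — a sketch of the idea, not LaraLamma's actual route.
use App\Models\Document;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;

Route::post('/collections/{collection}/sources/github', function (Request $request, int $collection) {
    // GitHub delivers the event (push, pull request, etc.) as JSON in the POST body.
    Document::create([
        'collection_id' => $collection,
        'type'          => 'github_webhook',
        'payload'       => $request->json()->all(),
    ]);

    return response()->noContent();
});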

Here you can see two things to start. First, the title hints at how we can send different repos to the same Collection, since this Collection will represent this client's or project's set of repositories, and we can add numerous Sources.

Second, and to me this is the most important: a user (although technical in this example) can parse JSON or other data using a plain-language Prompt.

LaraLamma will be introducing tools to help build prompts, save templates, add tokens (like a mail merge), etc.

In this case we pull some data out of the JSON to build a simple message from the commit messages that come in at that time.
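
To make that concrete, here is a hypothetical example of the kind of plain-language Prompt a user might write against a GitHub push payload. The wording and the PHP glue around it are my own illustration, not the exact prompt from the screenshot.

// Hypothetical plain-language Prompt for pulling commit messages out of a
// GitHub push payload. $payload would be the stored webhook JSON.
$commits = $payload['commits'] ?? [];

$prompt = "Below is the 'commits' array from a GitHub push webhook payload.\n"
    . "List each commit as \"<short sha> - <message> (<author name>)\", one per line.\n"
    . "Return only that list, nothing else.\n\n"
    . json_encode($commits, JSON_PRETTY_PRINT);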

Once the Source(s) come in, the system will break them up and vectorize them, like the following.

Yes my commit message could be better!
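
If you were doing that "break up and vectorize" step yourself, it looks roughly like this: chunk the text, fetch an embedding per chunk, and store it. A rough sketch only; the naive chunking, the embeddings model, and the document_chunks table are assumptions, not LaraLamma's implementation.

use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Http;

// Rough sketch of chunking and embedding an incoming Source.
function vectorize(int $documentId, string $text): void
{
    // Naive chunking into ~1000-character pieces; real systems split on
    // sentences or tokens and overlap the chunks.
    foreach (str_split($text, 1000) as $i => $chunk) {
        $embedding = Http::withToken(config('services.openai.key'))
            ->post('https://api.openai.com/v1/embeddings', [
                'model' => 'text-embedding-3-small',
                'input' => $chunk,
            ])
            ->json('data.0.embedding');

        DB::table('document_chunks')->insert([
            'document_id' => $documentId,
            'chunk_index' => $i,
            'content'     => $chunk,
            'embedding'   => json_encode($embedding), // a pgvector column in practice
        ]);
    }
}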

Step 3 — Output

Now how do we get the data out?

We make an “Output” (naming is hard 🙁)

These Outputs can hit another API, send an email, make a webpage, etc. Right now let's just make one that emails us a summary of commits.

You can have more than one Output, by the way.

In the above example the user can write a Prompt (choosing from a template if they want), and the LLM will then build what they want from the Collection's content.

In this case the Output they chose is Email. It will run the query (e.g. select all items in this Collection since the last time this Output ran), pass that "context" into the prompt, and email the results.
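
Spelled out in plain Laravel, a single Email Output run looks roughly like the sketch below. Assume $collectionId and $output (with last_run_at and recipient_email) describe the Output being run; the documents table, its columns, the model name, and the prompt text are all my own assumptions used to illustrate the flow, not LaraLamma's implementation.

use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\Mail;

// Gather everything added to the Collection since the last run.
$context = DB::table('documents')
    ->where('collection_id', $collectionId)
    ->where('created_at', '>', $output->last_run_at)
    ->pluck('content')
    ->implode("\n---\n");

// Hand that context to the LLM with the user's Prompt.
$summary = Http::withToken(config('services.openai.key'))
    ->post('https://api.openai.com/v1/chat/completions', [
        'model' => 'gpt-4o-mini',
        'temperature' => 0,
        'messages' => [
            ['role' => 'system', 'content' =>
                "Summarize the commits below as a short release log. Use only this context:\n" . $context],
            ['role' => 'user', 'content' => 'Write the release log.'],
        ],
    ])
    ->json('choices.0.message.content');

// Email the result to the Output's recipient.
Mail::raw($summary, fn ($mail) => $mail
    ->to($output->recipient_email)
    ->subject('Release log for your collection'));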

That is it: the user will now get an email that is a summary of the commit log!

Email

Of course my commit messages could be better, and someone could make an Output that creates a release on GitHub!

But in the meantime it shows how easy it is to have your RAG system automate tasks. Another recent use case was marketing emails that were all sent to a Collection, with a report generated to help surface leads to the right people. Another customer used it for customer feedback!

Anyway, some links are below. I hope you start to see how useful LaraLamma.ai can be!

Links



Written by Alfred Nutile

Creator of the LaraLlama.io open-source product. Day-to-day developer using Laravel, PHP, Vue.js, and Inertia.
