May Edition
Hello!
Hope you all have been doing well and found the previous newsletter engaging! This month, my interests went beyond the usual cloud and DevOps stuff to checking out some AI things as well. I'm excited to share those with you in this edition of the newsletter.
Running LLMs Locally!
I don’t know why I slept on this for so long, thinking it would be too complex. Believe me when I say, running an LLM locally is SUPER EASY with Ollama. Take it from someone who has absolutely no knowledge of the dev side of AI — all you gotta do is:
Download ollama from here: https://ollama.com/
Run this in your terminal:
$ ollama run llama3
And you’ll have an AI chatbot running locally!
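Beyond the terminal chat, Ollama also serves a local HTTP API (on port 11434 by default), so you can script against your local model too. Here’s a minimal Python sketch; the wrapper names and prompt are my own, and it assumes `ollama run llama3` (or `ollama serve`) is already running:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt: str, model: str = "llama3") -> dict:
    # stream=False returns one JSON object instead of a token stream
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt: str, model: str = "llama3") -> str:
    # Requires the Ollama server to be running locally
    data = json.dumps(build_payload(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

From there, `ask_ollama("Why is the sky blue?")` gives you a one-shot answer without ever leaving your machine.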
If you would like to give it a nice UI as well, to make it feel like ChatGPT, there are a lot of options listed here. I tried Campbell Chatbot, built by Ramiro Berrelleza, which was easy to get started with and worked well for me!
Suggested Reads
Now back to some DevOps stuff. I found this article by Mathieu Larose on building a SOC 2-compliant GitOps CI/CD pipeline with GitHub Actions really interesting! It covers how you can achieve a robust CI/CD workflow with just GitHub Actions and two repos: one for your application code and one for your infrastructure.
If that workflow still feels a bit complex, I wrote a beginner-friendly version of achieving something similar using Argo CD and GitHub Actions, so you might want to check that out first!
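To give you a feel for the two-repo pattern, here’s a rough sketch of what the app-repo workflow could look like: build and push an image, then bump the tag in the infra repo so the GitOps side picks it up. Repo names, secrets, and paths here are all hypothetical, not the article’s actual setup:

```yaml
# Hypothetical app-repo workflow for a two-repo GitOps setup
name: ci
on:
  push:
    branches: [main]
jobs:
  build-and-promote:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build and push image
        run: |
          docker build -t ghcr.io/example/app:${GITHUB_SHA} .
          docker push ghcr.io/example/app:${GITHUB_SHA}
      - name: Update image tag in infra repo
        run: |
          git clone https://x-access-token:${{ secrets.INFRA_TOKEN }}@github.com/example/infra.git
          cd infra
          sed -i "s/tag: .*/tag: ${GITHUB_SHA}/" app/values.yaml
          git commit -am "deploy app ${GITHUB_SHA}"
          git push
```

The nice part of this split is that the infra repo’s git history becomes your deployment audit log, which is exactly what compliance workflows like SOC 2 want.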
Dagger Has Me Excited!
These past couple of days I’ve been working on a Dagger module for Okteto, and that’s been pretty fun. For those of you who aren’t familiar with Dagger, it’s a really cool project that lets you write your CI/CD pipelines in the programming language of your choice and run them in containers. The benefits of this are:
Now that things are in code, they’re standardized and much easier to maintain than hacky bash scripts or other solutions you might be using.
Since these pipelines run in containers, you can run them locally as well! I find this super useful for fast iteration: instead of the traditional flow of pushing code and then waiting to see if CI fails, developers can run these pipelines pre-push in local containers.
You can run your first Dagger function locally by just running these two commands:
$ brew install dagger/tap/dagger
$ dagger -m github.com/shykes/daggerverse/hello@v0.1.2 call hello
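To make the “pipelines as code” idea concrete without pulling in the Dagger SDK itself, here’s a tiny dagger-free Python sketch of the same concept (step names and commands are made up): each step is just data plus code, so the exact same pipeline runs on your laptop or in CI.

```python
import subprocess

# Hypothetical pipeline: each step is a name plus a command.
# Real steps would run linters, tests, and image builds in containers.
STEPS = [
    ("lint", ["echo", "lint passed"]),
    ("test", ["echo", "tests passed"]),
    ("build", ["echo", "image built"]),
]

def run_pipeline(steps) -> dict:
    """Run each step in order, failing fast, and collect each step's output."""
    results = {}
    for name, cmd in steps:
        proc = subprocess.run(cmd, capture_output=True, text=True, check=True)
        results[name] = proc.stdout.strip()
    return results

if __name__ == "__main__":
    for name, out in run_pipeline(STEPS).items():
        print(f"{name}: {out}")
```

Dagger takes this much further, of course — typed functions, caching, and real containers per step — but the core win is the same: one pipeline definition that behaves identically everywhere.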
That was it for the May edition of my newsletter. I hope you found it useful and that it inspires you to try some of the things I mentioned.
Thanks for reading!