Author: Haidong Ji

  • Productive development using VS Code Continue extension with local LLMs

    In my last blog post, I talked about running LLMs locally. Read that first if you want to follow along. With local LLMs in place and running, I searched for ways to integrate them with VS Code. The answer is the Continue extension. Setup: In my limited experience so far, I was very impressed.…
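
    As an illustration of the kind of wiring involved (not the post's exact setup), Continue can point at a local Ollama model through its config file. The path and schema below are assumptions and may vary by extension version:

    ```zsh
    # Hypothetical sketch: point Continue at a local Ollama model.
    # ~/.continue/config.json and this schema are assumptions; check the
    # Continue docs for your extension version.
    mkdir -p ~/.continue
    cat > ~/.continue/config.json <<'EOF'
    {
      "models": [
        {
          "title": "Llama 3.1 8B (local)",
          "provider": "ollama",
          "model": "llama3.1:8b"
        }
      ]
    }
    EOF
    ```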

  • Run AI models locally with web interface

    I recently set up Ollama on my 7-year-old desktop (AMD Ryzen 7 1700, 8 cores, 32 GB RAM) with an equally old NVIDIA GPU (GeForce GTX 1070, 8 GB VRAM). I was able to run llama3.1:8b successfully via the terminal CLI. I then configured Open WebUI, which gives me a friendly UI to work with the…
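
    For reference, the basic flow is two commands: chat with a model through Ollama, then run Open WebUI in Docker against the host's Ollama. The Docker flags follow Open WebUI's published quick start at the time of writing; verify against the project's README:

    ```zsh
    # Chat with the model from the terminal (pulls it on first run)
    ollama run llama3.1:8b

    # Run Open WebUI in Docker, talking to Ollama on the host;
    # flags per the Open WebUI README -- verify for your version.
    docker run -d -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data \
      --name open-webui \
      ghcr.io/open-webui/open-webui:main
    ```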

  • Blog reading using Miniflux

    I used to host my own blog reader running TT-RSS. I’ve recently switched to Miniflux, and I highly recommend it. Advantages of Miniflux over TT-RSS My setup I ran it using docker-compose, along with the good ol’ Apache web server. Here is a sample docker-compose.yml file, and here is a sample Apache2 site conf. I use Let’s Encrypt to…
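
    The sample files themselves are elided from this excerpt. Purely for illustration (not the post's original file), a minimal Miniflux compose setup based on the project's documented example might look like this, with placeholder credentials:

    ```zsh
    # Illustrative sketch only -- not the post's original file.
    # Credentials and the Postgres version are placeholders.
    cat > docker-compose.yml <<'EOF'
    services:
      miniflux:
        image: miniflux/miniflux:latest
        ports:
          - "8080:8080"
        depends_on:
          - db
        environment:
          - DATABASE_URL=postgres://miniflux:secret@db/miniflux?sslmode=disable
          - RUN_MIGRATIONS=1
          - CREATE_ADMIN=1
          - ADMIN_USERNAME=admin
          - ADMIN_PASSWORD=change_me
      db:
        image: postgres:15
        environment:
          - POSTGRES_USER=miniflux
          - POSTGRES_PASSWORD=secret
        volumes:
          - miniflux-db:/var/lib/postgresql/data
    volumes:
      miniflux-db:
    EOF
    docker compose up -d
    ```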

  • Switch between different k8s clusters using zsh

    In my last post, I shared my method of switching between different AWS accounts using oh-my-zsh. Oftentimes, DevOps engineers and cloud administrators also manage multiple Kubernetes clusters. Just like knowing which AWS profile you are using in a terminal, it is important to know which k8s cluster context you are under. This helps to…
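
    The excerpt doesn't include the post's actual snippet, but one common shape for this in zsh is a small wrapper around kubectl config plus the current context in the right-hand prompt. The helper name and prompt styling here are my own, not necessarily the post's:

    ```zsh
    # Sketch: switch kubectl contexts and keep the active one visible.
    # The function name kctx is hypothetical; the post's method may differ.
    kctx() {
      if [[ -z "$1" ]]; then
        kubectl config get-contexts          # list available contexts
      else
        kubectl config use-context "$1"      # switch to the named context
      fi
    }

    # Show the current context on the right side of the prompt
    setopt PROMPT_SUBST
    RPROMPT='%F{blue}k8s:$(kubectl config current-context 2>/dev/null)%f'
    ```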

  • Switch between different AWS accounts using zsh

    It is pretty common for a company to have multiple AWS accounts, and a cloud administrator may find themselves needing to switch accounts regularly. When doing so, they also need to be aware of which account’s profile they are under. Fortunately, zsh running on macOS, Linux, or Windows WSL can assist! Assumptions: Create config file for…
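
    Again as a sketch rather than the post's exact snippet: the aws CLI honors the AWS_PROFILE environment variable, so a tiny zsh helper plus a prompt segment covers both switching and visibility (the helper name awsp is my own):

    ```zsh
    # Sketch: switch AWS profiles and show the active one in the prompt.
    # Profiles are the [profile NAME] sections in ~/.aws/config.
    awsp() {
      export AWS_PROFILE="$1"
    }

    setopt PROMPT_SUBST
    RPROMPT='%F{yellow}aws:${AWS_PROFILE:-default}%f'

    # Usage (hypothetical profile name):
    #   awsp staging
    #   aws sts get-caller-identity   # confirm which account you are in
    ```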