<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Posts on The .NET + LM Sandbox</title><link>https://lmcorner.net/posts/</link><description>Recent content in Posts on The .NET + LM Sandbox</description><generator>Hugo -- 0.150.1</generator><language>en-us</language><lastBuildDate>Mon, 23 Mar 2026 23:54:03 +0100</lastBuildDate><atom:link href="https://lmcorner.net/posts/index.xml" rel="self" type="application/rss+xml"/><item><title>Local LLM. Chapter 1</title><link>https://lmcorner.net/posts/local-llm/</link><pubDate>Mon, 23 Mar 2026 23:54:03 +0100</pubDate><guid>https://lmcorner.net/posts/local-llm/</guid><description>&lt;h3 id="hello-there-"&gt;Hello there! 🖖&lt;/h3&gt;
&lt;h3 id="recap"&gt;Recap&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Tool:&lt;/strong&gt; llama.cpp&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;OS:&lt;/strong&gt; Windows&lt;/li&gt;
&lt;/ul&gt;
&lt;pre tabindex="0"&gt;&lt;code&gt;1. [Hugging Face Models](https://huggingface.co/models)
2. App: llama.cpp
3. Model: SmolVLM-500M-Instruct-GGUF
4. Win + R -&amp;gt; powershell -&amp;gt; winget install llama.cpp
5. CMD -&amp;gt; llama-cli -hf ggml-org/SmolVLM-500M-Instruct-GGUF:Q8_0
/// Cleanup
1. winget list llama.cpp
2. winget uninstall --id ggml.llamacpp
&lt;/code&gt;&lt;/pre&gt;&lt;h2 id="step-by-step-implementation"&gt;Step-by-Step Implementation&lt;/h2&gt;
&lt;h3 id="quick-way-to-run-llamacpp-on-windows"&gt;Quick Way to Run llama.cpp on Windows&lt;/h3&gt;
&lt;ol&gt;
&lt;li&gt;Navigate to &lt;a href="https://huggingface.co/models"&gt;Hugging Face Models&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;App:&lt;/strong&gt; llama.cpp&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Model:&lt;/strong&gt; SmolVLM-500M-Instruct-GGUF
&lt;img alt="Hugging Face Model Select" loading="lazy" src="https://lmcorner.net/posts/local-llm/hf-1.png"&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;ol start="2"&gt;
&lt;li&gt;Click &amp;ldquo;Use this model&amp;rdquo; -&amp;gt; &amp;ldquo;llama.cpp&amp;rdquo;&lt;/li&gt;
&lt;/ol&gt;
&lt;ul&gt;
&lt;li&gt;A modal window will appear with instructions on how to install llama.cpp and a command to run the selected model.
&lt;img alt="Hugging Face Model App Select" loading="lazy" src="https://lmcorner.net/posts/local-llm/hf-2.png"&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;ol start="3"&gt;
&lt;li&gt;Use WinGet to install and run:&lt;/li&gt;
&lt;/ol&gt;
&lt;pre tabindex="0"&gt;&lt;code&gt;# Press Win + R and type powershell (or use Terminal/CMD)
# 1. Install llama.cpp via Windows Package Manager
winget install llama.cpp
# 2. Download and run the model directly from Hugging Face in the console
llama-cli -hf ggml-org/SmolVLM-500M-Instruct-GGUF:Q8_0
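# 3. (Optional) Serve the model over HTTP instead of the interactive CLI.
#    llama-server ships in the same winget package; by default it listens on
#    port 8080 and exposes an OpenAI-compatible API (assumed defaults here).
llama-server -hf ggml-org/SmolVLM-500M-Instruct-GGUF:Q8_0
# 4. From a second console, send a test chat request (PowerShell quoting):
curl.exe http://localhost:8080/v1/chat/completions -H 'Content-Type: application/json' -d '{"messages":[{"role":"user","content":"Hello!"}]}'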
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;&lt;img alt="alt text" loading="lazy" src="https://lmcorner.net/posts/local-llm/hf-3.png"&gt;&lt;/p&gt;</description></item><item><title>Hello World!</title><link>https://lmcorner.net/posts/hello-world/</link><pubDate>Wed, 11 Mar 2026 13:01:13 +0100</pubDate><guid>https://lmcorner.net/posts/hello-world/</guid><description>&lt;h3 id="hello-there-"&gt;Hello there! 🖖&lt;/h3&gt;
&lt;p&gt;My name is Aleksandr, and I am a software engineer.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Why this blog?&lt;/strong&gt; Well, LLMs have changed everything. To be honest, they&amp;rsquo;ve blown me away.&lt;/p&gt;
&lt;p&gt;For me, coding isn&amp;rsquo;t just a job — it&amp;rsquo;s a hobby. Over the years, I’ve felt that &amp;ldquo;wow&amp;rdquo; effect only a few times:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;When my dad took my brother and me to his lab, showed us a room full of &amp;ldquo;Corvette&amp;rdquo; computers, and launched a game from an 8″ floppy disk 😅;&lt;/li&gt;
&lt;li&gt;When I first saw Windows 95;&lt;/li&gt;
&lt;li&gt;When I created a snowman in QuickBasic following my dad’s instructions;&lt;/li&gt;
&lt;li&gt;When I wrote my first Delphi app and everything looked so &amp;ldquo;beautiful&amp;rdquo; with almost no effort;&lt;/li&gt;
&lt;li&gt;When I discovered version control systems (and the nightmare of manually archiving folders by date became a thing of the past);&lt;/li&gt;
&lt;li&gt;And last but not least, when I tried LLMs for the first time.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The world of LLMs and the infrastructure around them is changing so fast that I struggle to keep up&amp;hellip; In this blog, I want to share my thoughts and findings. I&amp;rsquo;m just a regular coder, not some &amp;lsquo;Holy-Logic-Batman&amp;rsquo; genius, so everything here is filtered through my perception and is by no means the ultimate truth.&lt;/p&gt;</description></item></channel></rss>