<?xml version="1.0" encoding="UTF-8" ?>
<rss version="2.0">
    <channel>
      <title>Leo&#039;s Notebook</title>
      <link>https://garden.azl.au</link>
      <description>Last 10 notes on Leo&#039;s Notebook</description>
      <generator>Quartz -- quartz.jzhao.xyz</generator>
      <item>
    <title>2026-04-25</title>
    <link>https://garden.azl.au/maths/2026-04-25</link>
    <guid>https://garden.azl.au/maths/2026-04-25</guid>
    <description><![CDATA[  ]]></description>
    <pubDate>Fri, 24 Apr 2026 23:00:00 GMT</pubDate>
  </item>
  <item>
    <title>Kernels</title>
    <link>https://garden.azl.au/ai/kernels</link>
    <guid>https://garden.azl.au/ai/kernels</guid>
    <description><![CDATA[ The word "kernel" is wildly overloaded. Math kernels, CUDA kernels, attention kernels — what they share and what they don't. ]]></description>
    <pubDate>Fri, 24 Apr 2026 00:00:00 GMT</pubDate>
  </item>
  <item>
    <title>DishBrain - on sentience</title>
    <link>https://garden.azl.au/ai/dishbrain-sentience</link>
    <guid>https://garden.azl.au/ai/dishbrain-sentience</guid>
    <description><![CDATA[ A quote from Kagan et al. on the two processes needed for sentient behaviour. ]]></description>
    <pubDate>Thu, 23 Apr 2026 23:00:00 GMT</pubDate>
  </item>
  <item>
    <title>Attention Kernels</title>
    <link>https://garden.azl.au/ai/attention-kernels</link>
    <guid>https://garden.azl.au/ai/attention-kernels</guid>
    <description><![CDATA[ What "attention kernel" means, why the naive version is slow, and how FlashAttention fixes it. ]]></description>
    <pubDate>Thu, 23 Apr 2026 22:00:00 GMT</pubDate>
  </item>
  <item>
    <title>Attention</title>
    <link>https://garden.azl.au/ai/attention</link>
    <guid>https://garden.azl.au/ai/attention</guid>
    <description><![CDATA[ Softmax takes a vector of attention scores and turns it into a probability distribution. ]]></description>
    <pubDate>Thu, 23 Apr 2026 22:00:00 GMT</pubDate>
  </item>
  <item>
    <title>Active Inference and Free Energy</title>
    <link>https://garden.azl.au/ai/active-inference-free-energy</link>
    <guid>https://garden.azl.au/ai/active-inference-free-energy</guid>
    <description><![CDATA[ A handful of terms that keep coming up in AI safety research. Some are new to me, some aren’t — I’m writing them down anyway. Repetition is helpful. ]]></description>
    <pubDate>Wed, 22 Apr 2026 20:00:00 GMT</pubDate>
  </item>
  <item>
    <title>Encoders and Decoders in LLMs</title>
    <link>https://garden.azl.au/ai/encoders-and-decoders</link>
    <guid>https://garden.azl.au/ai/encoders-and-decoders</guid>
    <description><![CDATA[ Basics again. I must have heard it so many times: Claude, ChatGPT, and co. are decoder-only models. Cool. ]]></description>
    <pubDate>Wed, 22 Apr 2026 09:00:00 GMT</pubDate>
  </item>
  <item>
    <title>Lab Notebooks and Digital Gardens</title>
    <link>https://garden.azl.au/</link>
    <guid>https://garden.azl.au/</guid>
    <description><![CDATA[ I’ve been a keen keeper of lab notebooks. They help you record your thoughts as you try different things. ]]></description>
    <pubDate>Tue, 21 Apr 2026 23:00:00 GMT</pubDate>
  </item>
    </channel>
</rss>