Clifford algebras in Julia
In this blog post, we explore the Grassmann.jl library, written in Julia, for working with Clifford algebras. First, we import the library and the GLMakie plotting library:

using Grassmann
using GLMakie
set_theme!(theme_light())

The following code constructs a positive definite 3-dimensional TensorBundle with an additional plane at infinity:

@basis S"∞+++"
(⟨∞111⟩, v, v∞, v₁, v₂, v₃, v∞₁, v∞₂, v∞₃, v₁₂, v₁₃, v₂₃, v∞₁₂, v∞₁₃, v∞₂₃, v₁₂₃, v∞₁₂₃)

We then plot the following vector field: ...
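For readers skimming this listing, here is a minimal sketch of what the setup above gives you. It reuses the basis elements that @basis injects into scope (v₁, v₂, v₃, v₁₂, … as listed in the returned tuple) and assumes Grassmann.jl's ∧ (exterior) and * (geometric) products; the exact printed form of the results may vary between versions.

```julia
using Grassmann

# Same declaration as in the post: a plane at infinity plus three
# positive-definite generators; @basis injects v, v₁, v₂, … into scope.
@basis S"∞+++"

v₁ ∧ v₂          # exterior product of two basis vectors: the bivector v₁₂
v₁ * v₁          # geometric product of a positive generator with itself: the scalar 1
(v₁ + v₂) ∧ v₃   # the wedge is bilinear: v₁₃ + v₂₃
```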
Hello, Typst
Here we test example Typst code from various sources. ...
Hello, Quarto
The following are taken from official examples for now. ...
Notes on Zeon Algebra
I have just borrowed Clifford Algebras and Zeons: Geometry to Combinatorics and Beyond (Staples, 2020, World Scientific) from the library. This post should briefly walk through the contents of the book, highlight some key concepts, and provide further readings for each chapter. For the complete and updated research work by George Stacey Staples, see his home page....
Studying group algebras with GAP
This post studies group algebras with GAP, focusing on a few groups of interest. See My math interests in 2024: Group Algebra for context. It's helpful to read the GAP Manual and related Stack Overflow questions before using GAP.

Installation

I'm using a Mac, so I ran the following commands to install and start GAP:

brew install wget autoconf gmp readline
wget https://github.com/gap-system/gap/releases/download/v4.13.0/gap-4.13.0.tar.gz
tar xzvf gap-4.13.0.tar.gz
cd gap-4.13.0
./autogen.sh
./configure
make -j4 V=1 all
make check
make install
which gap
gap -l ....
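The post itself works inside GAP, but as a language-neutral illustration of the object being studied, here is a minimal plain-Julia sketch of a group algebra (Julia being the language used elsewhere on this blog): elements of Q[C₃] stored as dictionaries mapping group elements to coefficients, multiplied by convolution. The group, coefficient field, and representation are illustrative choices and do not come from the post.

```julia
# Minimal sketch: the group algebra Q[C3] of the cyclic group of order 3.
# An element is a Dict(group element => coefficient); the product is the
# convolution (a*b)[g⋅h] += a[g] * b[h].
groupop(g, h) = mod(g + h, 3)        # C3 written additively as {0, 1, 2}

function algmul(a, b)
    c = Dict{Int,Rational{Int}}()
    for (g, x) in a, (h, y) in b
        k = groupop(g, h)
        c[k] = get(c, k, 0//1) + x * y
    end
    return c
end

# Example: (1 + g)^2 = 1 + 2g + g^2 in Q[C3], where g is the generator.
a = Dict(0 => 1//1, 1 => 1//1)
println(algmul(a, a))                # contains 0 => 1, 1 => 2, 2 => 1
```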
My math interests in 2024
I wish this post to be a continuously updated list of my math interests in 2024, with proper citations to the literature, as I keep wandering in the math wonderland and don't want to get lost in it without breadcrumbs. Some interests that have older origins will gradually be moved to corresponding posts for earlier years. I also hope certain interests will develop into research projects, leaving only a brief summary and a link here....
Transformers: from self-attention to performance optimizations
The purpose of this post is to understand what happens under the hood, and which factors govern performance, when fine-tuning and running local Transformer models, keeping multi-modality in mind, with an emphasis on decoder-only transformers (e.g. the GPT series). To accomplish this, we first present a brief account of the transformer architecture, including its design intuitions and the underlying mathematics, made concrete by illustrative diagrams and code snippets. Then we aim for a comprehensive understanding of the widely adopted performance optimizations of the original transformer architecture....
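As a taste of the mathematics and code the post refers to, below is a minimal sketch of single-head scaled dot-product self-attention with a causal mask, the core operation of a decoder-only block, written in plain Julia for consistency with the other code on this blog; the dimensions, weight initialization, and function names are illustrative and not taken from the post.

```julia
using LinearAlgebra

# Single-head causal self-attention on a sequence of T tokens with model width d.
# x is a T×d matrix of token embeddings; Wq, Wk, Wv are d×d projection matrices.
function causal_self_attention(x, Wq, Wk, Wv)
    T, d = size(x)
    Q, K, V = x * Wq, x * Wk, x * Wv              # project tokens to queries/keys/values
    scores = (Q * K') ./ sqrt(d)                   # T×T scaled dot-product scores
    mask = [j <= i ? 0.0 : -Inf for i in 1:T, j in 1:T]  # causal mask: no attending to future tokens
    A = softmax_rows(scores .+ mask)               # attention weights, one row per query
    return A * V                                   # weighted sum of value vectors
end

# Row-wise softmax with the usual max-subtraction for numerical stability.
function softmax_rows(S)
    e = exp.(S .- maximum(S, dims=2))
    return e ./ sum(e, dims=2)
end

# Toy usage with random weights: 5 tokens, width 8.
T, d = 5, 8
x = randn(T, d)
Wq, Wk, Wv = randn(d, d), randn(d, d), randn(d, d)
y = causal_self_attention(x, Wq, Wk, Wv)
@assert size(y) == (T, d)
```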