Discussion about this post

Neural Foundry:

This dimensionality concept is crucial for understanding neural network operations. The fact that vM gives a 1x5 result (row vector) while Mv gives a 5x1 result (column vector) really highlights how matrix algebra enforces consistent shapes through computation. It reminds me of how attention mechanisms in transformers rely heavily on getting these multiplications right. Would be interesting to see this extended to batched operations too.
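The shape bookkeeping described above can be sketched in NumPy (the specific values here are illustrative, chosen only so that v is 1x5 and M is 5x5, matching the dimensions in the comment):

```python
import numpy as np

v = np.arange(5.0).reshape(1, 5)  # row vector, shape (1, 5)
M = np.eye(5)                     # 5x5 matrix

row_result = v @ M       # (1,5) @ (5,5) -> (1,5): a row vector
col_result = M @ v.T     # (5,5) @ (5,1) -> (5,1): a column vector
print(row_result.shape)  # (1, 5)
print(col_result.shape)  # (5, 1)

# Batched extension: matmul broadcasts over leading batch axes,
# so a stack of row vectors multiplies M in one call.
batch_v = np.stack([v, 2 * v])  # shape (2, 1, 5)
batch_result = batch_v @ M      # shape (2, 1, 5)
print(batch_result.shape)       # (2, 1, 5)
```

Note how the inner dimensions must agree in every product, which is exactly the consistency the comment points out.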

Orith Loeb:

It is interesting to see the CPU batching behavior!

