Walsh-Hadamard transform

It is well known at this stage that the fast Walsh-Hadamard transform can be used to build fast Random Projections (RP), which in turn give fast Locality Sensitive Hashing (LSH).
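For reference, here is a minimal sketch of the in-place fast Walsh-Hadamard transform (the function name fwht, the NumPy setup, and the orthonormal scaling are my own choices, not taken from the booklet). The point is that it runs in O(n log n) time with no stored projection matrix, which is exactly what makes the resulting random projections fast:

```python
import numpy as np

def fwht(x):
    """In-place fast Walsh-Hadamard transform of a float NumPy array.
    len(x) must be a power of 2; runs in O(n log n), no matrix stored."""
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b  # butterfly step
        h *= 2
    x /= np.sqrt(n)  # scale so the transform is orthonormal
    return x
```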
Anyway, here is a booklet that was supposed to earn me some money during the lockdown but didn't: https://archive.org/details/whtebook-archive
And some blogs that were never intended to earn me money and didn't: https://ai462qqq.blogspot.com/

You can implement associative memory using the fast Walsh-Hadamard transform (WHT):

  • A fixed random pattern of sign flips applied before the WHT gives a Random Projection (RP).
  • Taking the signs of the RP output gives a Locality Sensitive Hash (LSH).
  • Using the LSH to look up weights, combined with a simple adjustment mechanism, gives an associative memory (see the sketch after this list).
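Putting the three bullets together, here is a minimal sketch of the whole pipeline, assuming the fwht function above. The fixed sign-flip pattern, the scalar-valued memory, and the particular adjustment rule (spreading the recall error evenly across the n weights) are illustrative assumptions on my part, not necessarily the exact scheme in the linked post:

```python
rng = np.random.default_rng(0)
n = 64  # input dimension, must be a power of 2
flips = rng.choice([-1.0, 1.0], size=n)  # fixed random sign-flip pattern

def random_projection(v):
    # Sign flips followed by the WHT: a fast structured random projection.
    return fwht(flips * v)  # flips * v makes a copy, so the caller's data is untouched

def lsh_bits(v):
    # The signs of the projection are the locality sensitive hash, here as +/-1.
    return np.where(random_projection(v) >= 0.0, 1.0, -1.0)

class AssociativeMemory:
    """Scalar associative memory: each hash bit selects + or - on one weight."""
    def __init__(self, n):
        self.w = np.zeros(n)

    def recall(self, v):
        return float(lsh_bits(v) @ self.w)

    def store(self, v, target):
        b = lsh_bits(v)
        err = target - (b @ self.w)
        self.w += b * err / len(self.w)  # spread the error evenly over all weights

# Usage: store two patterns, then recall them.
mem = AssociativeMemory(n)
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
mem.store(x1, 1.0)
mem.store(x2, -1.0)
print(mem.recall(x1), mem.recall(x2))  # ~1.0 and -1.0; earlier stores drift slightly
```

Because the hash bits of two different inputs are nearly uncorrelated, each store only slightly disturbs earlier memories, which is what the even error-spreading update relies on.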

https://ai462qqq.blogspot.com/2023/04/associative-memory-using-locality.html

Large-scale sparse associative memory:
https://archive.org/details/sparse-am