Does AlphaZero generalize?

In my experiments using AlphaZero methodology, I taught an AI via self-play to excel at my Cosmic Tic-Tac-Toe game below. You can play it too here! The game works just like Tic-Tac-Toe, except that there is a hole in one of the squares where no one can place a piece. The first player … Continue reading Does AlphaZero generalize?
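The one rule change described above (a standard 3×3 board with one square blocked by a hole) is easy to sketch. The flat 9-cell representation, the choice of which square holds the hole, and all names below are my own illustration, not the post's actual code:

```python
# A minimal sketch of a Cosmic Tic-Tac-Toe board: ordinary 3x3
# Tic-Tac-Toe, plus one "hole" square no one may play on.
HOLE = 4  # hypothetical choice; the post doesn't say which square is blocked

def legal_moves(board: list) -> list:
    # A move is legal on any empty cell that isn't the hole.
    return [i for i in range(9) if board[i] is None and i != HOLE]

# A fresh board has 8 legal moves instead of the usual 9.
empty_board = [None] * 9
```

An AlphaZero-style agent would then search and learn over this reduced move set exactly as it would over the standard game.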

Practical RAG Tips

On 16 Nov 2023, I went to Singapore's Machine Learning meetup, where the topic was Retrieval-Augmented Generation. Practical Tips: There were quite a few practical tips shared. Here's a summary of what to do to build your RAG database: Don't take in duplicate documents (this helps give better variety for cosine similarity later) … Continue reading Practical RAG Tips
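The first tip above, dropping duplicate documents before they enter the RAG database, could be sketched like this. The hash-based approach and all function names are my own illustration, not from the talk:

```python
import hashlib

def normalize(text: str) -> str:
    # Lowercase and collapse whitespace so trivial formatting
    # differences don't defeat the duplicate check.
    return " ".join(text.lower().split())

def dedup_documents(docs: list) -> list:
    # Keep only the first copy of each document, comparing by a
    # hash of its normalized text. This keeps the database varied,
    # so cosine-similarity retrieval doesn't return near-identical hits.
    seen = set()
    unique = []
    for doc in docs:
        key = hashlib.sha256(normalize(doc).encode()).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(doc)
    return unique
```

Exact-hash matching only catches verbatim duplicates; near-duplicates would need fuzzier checks (e.g. embedding similarity thresholds).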

We don’t calculate probabilities, we predict multiple outcomes via analogy

I have been thinking about how we can possibly calculate probabilities in our minds without using the notion of probability. See Part 1 for a discussion of whether the world is fundamentally deterministic or probabilistic. TL;DR: We don't know, but the world is definitely uncertain. How can we plan and act well enough to survive … Continue reading We don’t calculate probabilities, we predict multiple outcomes via analogy

Is the world probabilistic or deterministic?

Ever since the great debate on quantum physics (and possibly even earlier), we have been asking ourselves whether the world is probabilistic - that is, is there no way to pre-determine what will happen? Einstein was adamant that "God does not play dice" and that the world should fundamentally be certain. Advocates of quantum theory … Continue reading Is the world probabilistic or deterministic?

A Modern Take on “Godel, Escher, Bach: An Eternal Golden Braid”

Recently, I came across a book by Douglas Hofstadter, which talks about the fundamental concepts underpinning intelligence, while at the same time weaving in ideas from three people: Godel, a mathematician who famously proved the Incompleteness Theorems, Escher, an artist who melds reality and fantasy in his artwork, Bach, a musician who composes … Continue reading A Modern Take on “Godel, Escher, Bach: An Eternal Golden Braid”

Recurrent Neural Networks (RNN) vs Markov Models, which is better for Text Generation (Part 2)

Comparing on the News Dataset This is Part 2 of comparing RNNs to Markov models. Do refer to Part 1 for more on the methods used. The notebook with the code can be found at my Github link. https://www.youtube.com/watch?v=YykFbnVdqKE&list=PLcORGO1bTgjUrFJJr1nOnUV7zp0esSHp_&t=0s Part 2: RNNs vs Markov Models (News / Coding Dataset) In order to have a … Continue reading Recurrent Neural Networks (RNN) vs Markov Models, which is better for Text Generation (Part 2)

Recurrent Neural Networks (RNNs) vs Markov Models: Which is better for Text Generation? (Part 1)

I was interested to see how an RNN would compare to Markov models for text generation. I have read some blogs stating that a Markov model could work pretty well too, so I was curious whether it really does. So, here's my attempt to build both kinds of models and compare their performance on text … Continue reading Recurrent Neural Networks (RNNs) vs Markov Models: Which is better for Text Generation? (Part 1)
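For context, a minimal word-level Markov text generator of the kind being compared might look like the sketch below. The function names and the seeded sampling are my own illustration, assuming a first-order (one word of context) model, not the notebook's actual code:

```python
import random
from collections import defaultdict

def build_chain(text: str) -> dict:
    # First-order Markov model: map each word to the list of words
    # that follow it in the corpus (repeats preserved, so frequent
    # successors are sampled more often).
    words = text.split()
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain: dict, start: str, n: int = 10, seed: int = 0) -> str:
    # Walk the chain from a start word, sampling each next word
    # from the observed successors; stop early at a dead end.
    rng = random.Random(seed)
    out = [start]
    for _ in range(n - 1):
        successors = chain.get(out[-1])
        if not successors:
            break
        out.append(rng.choice(successors))
    return " ".join(out)
```

An RNN replaces this lookup table with a learned hidden state, which is what lets it use context longer than a fixed window, the crux of the comparison in this series.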

How do babies think? (My proposed) Monte-Carlo Simulation Voting Hypothesis

Recently, I went to visit my nephew, and I realized that he had taken to calling his mother "bicycle" when her actual name is "Jessica". Then it occurred to me that this is probably an instance of best-matching, where the baby simply speaks whatever sounds closest to the desired sound. He probably tried out various pronunciations … Continue reading How do babies think? (My proposed) Monte-Carlo Simulation Voting Hypothesis