I've achieved Kaggle Grandmaster status! At present, there are only 464 of us worldwide. I'm also part of an even smaller group: Kaggle Code Grandmasters, just 84 at last count.

If you're unfamiliar, Kaggle (http://kaggle.com) is a machine learning community where people share code and compete in challenges on topics ranging from diagnosing disease using sensor data to using generative AI to fix bugs on GitHub.

Code Grandmaster status is earned by sharing solutions to machine learning problems (as Python notebooks) that are voted "Gold" by expert-level Kaggle users.

Out of over 50,000 Kaggle users, my coding contributions currently rank 5th globally.

👉 You can check out my Kaggle notebooks here: https://www.kaggle.com/richolson/code

Kaggle competitions often come with prize money, typically around $15,000 for first place. But the best part? There are no barriers to entry. You don't have to start from scratch: you can take someone else's code, build on it, and improve it. This gets to how I became a Code Grandmaster.

When a new competition drops, I get to work on a baseline solution: a relatively simple approach that shows how the problem can be meaningfully solved. Every competition has a public leaderboard based on a scoring metric. A decent score means I'm on the right track.
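To make that concrete, here's roughly what a first-pass baseline looks like in notebook form: load the competition's training and test tables, fit a simple model, check a cross-validated score, and write a submission file for the leaderboard. This is a generic sketch with made-up file names, column names, and metric, not code from any particular competition.

```python
# Minimal baseline sketch (hypothetical file/column names, not a specific competition).
import pandas as pd
from lightgbm import LGBMRegressor
from sklearn.model_selection import cross_val_score

train = pd.read_csv("train.csv")   # placeholder competition files
test = pd.read_csv("test.csv")

target = "target"                  # placeholder target column
features = [c for c in train.columns if c not in ("id", target)]

model = LGBMRegressor(n_estimators=500, learning_rate=0.05)

# Quick cross-validated score to see if the approach is on the right track
# (sign flipped because sklearn reports negative error).
cv_rmse = -cross_val_score(model, train[features], train[target],
                           scoring="neg_root_mean_squared_error", cv=5).mean()
print(f"CV RMSE: {cv_rmse:.4f}")

# Fit on all training data and write the leaderboard submission.
model.fit(train[features], train[target])
submission = pd.DataFrame({"id": test["id"],
                           target: model.predict(test[features])})
submission.to_csv("submission.csv", index=False)
```

Even a modest leaderboard score from something this simple confirms the data loading, modeling, and submission steps are wired up correctly, which is what a baseline is for.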

Once I have a working solution, I go back and streamline it, boiling it down to the core ideas and removing anything unnecessary. I add notes to explain parts that aren't immediately obvious, aiming to make the key concepts understandable to someone with only minimal machine learning experience.

Then I ๐—บ๐—ฎ๐—ธ๐—ฒ ๐˜๐—ต๐—ฒ ๐—ป๐—ผ๐˜๐—ฒ๐—ฏ๐—ผ๐—ผ๐—ธ ๐—ฝ๐˜‚๐—ฏ๐—น๐—ถ๐—ฐโ€”for anyone to copy and improve on.

Some Examples:

๐Ÿ–๏ธ Using a 1D CNN to decipher gestures from smartwatch-style sensor data:
https://www.kaggle.com/code/richolson/cmi-2025-1d-cnn-imu-only-baseline

🧪 Predicting physical properties of polymers from chemical structure:
https://www.kaggle.com/code/richolson/smiles-rdkit-lgbm-ftw

🧠 Using small LLMs to recognize errors in human math:
https://www.kaggle.com/code/richolson/eedi-llm-benchmark

๐Ÿ–ผ๏ธ Classifying skin lesions as cancerous or benign using an ImageNet:
https://www.kaggle.com/code/richolson/isic-2024-imagenet-train-oof-preds-public

๐Ÿ•ถ๏ธ Making LLMs misbehave with adversarial prompt engineering:
https://www.kaggle.com/code/richolson/add-it-up
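
As a taste of what the first example above involves, here is a minimal sketch of a 1D CNN that classifies fixed-length windows of smartwatch-style sensor readings. The window length, channel count, and number of gesture classes are placeholder values; this is illustrative only, not the actual competition notebook.

```python
# Illustrative 1D CNN for gesture classification from IMU sensor windows.
# Shapes are placeholders: 128 timesteps x 7 channels, 8 gesture classes.
import numpy as np
from tensorflow.keras import layers, models

TIMESTEPS, CHANNELS, N_CLASSES = 128, 7, 8

model = models.Sequential([
    layers.Input(shape=(TIMESTEPS, CHANNELS)),
    layers.Conv1D(64, kernel_size=5, padding="same", activation="relu"),
    layers.BatchNormalization(),
    layers.MaxPooling1D(2),
    layers.Conv1D(128, kernel_size=3, padding="same", activation="relu"),
    layers.BatchNormalization(),
    layers.GlobalAveragePooling1D(),   # pool over time -> one vector per window
    layers.Dense(N_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy arrays just to show the expected input shapes.
X = np.random.randn(32, TIMESTEPS, CHANNELS).astype("float32")
y = np.random.randint(0, N_CLASSES, size=32)
model.fit(X, y, epochs=1, verbose=0)
```

The convolutions learn short motion patterns across the sensor channels, and the global pooling lets the same model handle any window of that length.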

Sharing most of my work isn't exactly conducive to winning competitions. People can easily tweak my public notebooks and outscore me with my own code.

So why did I author over 100 public notebooks if they hurt my chances at prize money?

Because I wasn't (usually) trying to win; I was doing it to learn.

"If you can't explain it simply, you don't understand it well enough."
- Dolly Parton
(Not really. But also not Albert Einstein.)
