As I look back on 2024, it's been a year filled with deep dives into AI/ML and scientific computing, with sprinkles of materials science, chemistry, and physics. My coverage by category looks like this:
2024 category coverage
I figured it would be a good idea to write a summary of my posts for the year. I haven't done this in the past, but I think it has some value for me. Here are some of the highlights from the year.
Materials Science & AI Integration
The year started strong with my exploration of materials informatics tools. I worked extensively with the ASE Phonons class for phonon calculations, even creating a pickling utility to save calculation states. This theme of materials science tooling continued with my investigation of CrystaLLM, a fascinating LLM trained on crystallographic data.
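The pickling utility itself isn't reproduced here, but the core idea can be sketched with plain `pickle`. The function and file names below are hypothetical; with ASE's `Phonons` you would typically save derived data (e.g. displacements or force constants) rather than the object itself, since attached calculators often don't pickle cleanly.

```python
import pickle

def save_state(obj, path):
    """Pickle a calculation's state to disk so a long run can be resumed later.

    Hypothetical helper: pass plain data (dicts, arrays, tuples) extracted
    from the calculation rather than objects holding live calculators.
    """
    with open(path, "wb") as f:
        pickle.dump(obj, f)

def load_state(path):
    """Restore a previously saved calculation state."""
    with open(path, "rb") as f:
        return pickle.load(f)

# Plain data standing in for phonon calculation parameters/results:
state = {"supercell": (2, 2, 2), "delta": 0.01}
save_state(state, "phonon_state.pkl")
restored = load_state("phonon_state.pkl")
```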
A significant portion of my year was dedicated to implementing and understanding Graph Neural Networks (GNNs) for materials science. I documented my journey with Crystal Graph CNNs here, then here, and provided updates on my progress at the end of May. The exploration somewhat culminated in working with MACE and LAMMPS for atomistic simulations, where I provided working solutions using Apptainer. This will remain an area of interest and focus for me.
Scientific Computing & Infrastructure
I spent considerable time improving my computational workflow and understanding. This included writing a post on GPU computing and implementing Lennard-Jones potential calculations to compare CPU vs GPU performance. I also explored tools for high-throughput computational chemistry, documenting how to democratize these capabilities. To date, this is one of my most popular posts and gets cross-linked from other sites.
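As a rough sketch of the CPU-side baseline in that comparison (the names are mine, not from the original post), the Lennard-Jones energy of a configuration can be computed with a pairwise loop; the GPU version essentially replaces this loop with array operations evaluated on the device.

```python
import math
from itertools import combinations

def lj_pair(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair potential: V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def total_energy(positions, epsilon=1.0, sigma=1.0):
    """Sum pair energies over all unique pairs (O(N^2) CPU reference)."""
    energy = 0.0
    for p, q in combinations(positions, 2):
        energy += lj_pair(math.dist(p, q), epsilon, sigma)
    return energy

# Two atoms at the minimum-energy separation r = 2**(1/6) * sigma,
# where the energy equals -epsilon (the well depth):
pos = [(0.0, 0.0, 0.0), (2 ** (1 / 6), 0.0, 0.0)]
print(total_energy(pos))  # ≈ -1.0
```

Swapping the explicit loop for vectorized distance matrices (NumPy on CPU, CuPy or JAX on GPU) is what makes the performance comparison interesting as the particle count grows.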
One interesting development was my work with electronic lab notebooks, particularly examining Prof. Kitchin's ELN approach using emacs org-mode. This tied into my broader interest in scientific documentation and reproducibility.
I documented my approach to using the Materials Project API to streamline data retrieval for materials research, and finally got around to packaging and releasing an old polycrystalline tool.
AI Safety and Innovation
I kept a close eye on AI developments, particularly around their use in materials science and physics. I did have a post early in the year on safety and innovation, covering AI safety and Karpathy's work, and later explored exciting developments like Kolmogorov-Arnold Networks (KANs), which represent a fascinating new approach to neural network architecture. Interestingly, I haven't really followed up on KANs or what progress has been made with them 🤔
Learning Methods and New Compute Paradigms
I formalized my learning approach in a post, breaking down how I tackle new subjects through a priming, compressing, and authoring method. This methodology proved particularly useful as I explored the new territory of thermodynamic computing here, then here, and most recently here. I still have a lot to learn on this topic and need to finish going through the book by Peliti and Pigolotti.
Other Insights
As part of my day job, I had to start thinking a little about the challenges of simulation methods and computing setups. This led me to spend some of my personal time looking into off-lattice, on-the-fly KMC simulations and the potential of remote disk mounting for efficient data management.
Looking Forward
As the year closes, I'm particularly excited about the convergence of materials science and AI. The rapid development and sheer number of tools coming out in this space is super exciting; it's just hard to keep up and determine what the competing trade-offs are. The one area that I didn't write enough about was self-driving labs, and there is still a lot I need to read on and toy around with. I really think this is going to change the way materials development work is done in the future. My guess is that in 10 years, decision making and operational tasks will be AI-led and orchestrated, while top-level guidance will still come from humans.
So blog writing in 2024 has reinforced the importance of building strong foundations while staying current with cutting-edge developments. Whether it was understanding fundamental concepts like entropy or exploring new computational tools and methods, each investigation has added to my toolkit for tackling future challenges.
I would say the year has been marked by a balance between theoretical understanding and practical implementation, from getting around to posting on basic physics concepts like the Fermi energy1 to implementing complex computational tools. This combination of theory and practice continues to be essential for meaningful progress in computational materials science. It was a good year of learning and a decent one for blog writing. I'm hoping to continue into 2025 and keep pushing the boundaries of what I know and can do!
As this is my last post for 2024, I'll see you in 2025, and happy new year 🥳!
Footnotes
- Turns out that I had already written a post on the Fermi energy, but forgot that I had done so, and wrote a new post that ended up being similar but with a slightly different focus. ↩