Matrix multiplication and GR

I forgot to mention yesterday that I had also implemented a way to parallelize matrix multiplication. It's trivial and I'm sure a lot of people have done it, but I still wanted to try it myself. And no, before you think that I used OpenMP or MPI, umm, nope. I used the multiprocessing library in Python. It was easier that way and far more intuitive to implement. A sketch of the idea is below, minus the mess around it. Tonight I need to implement other adaptive Runge-Kutta methods. By other, I mean methods with coefficients other than the RK(4,5) method.
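This is a minimal sketch of the row-splitting approach, not the original code from the post: rows of A are divided into blocks, each worker process multiplies its block by B, and the pieces are stacked back together. The function names and the use of NumPy here are my own assumptions.

```python
from multiprocessing import Pool
import numpy as np

def multiply_rows(args):
    """Multiply a block of rows of A by the full matrix B (hypothetical helper)."""
    rows, B = args
    return rows @ B

def parallel_matmul(A, B, n_workers=4):
    """Parallelize A @ B by distributing row blocks of A across worker processes."""
    row_blocks = np.array_split(A, n_workers, axis=0)
    with Pool(processes=n_workers) as pool:
        results = pool.map(multiply_rows, [(block, B) for block in row_blocks])
    return np.vstack(results)

if __name__ == "__main__":
    A = np.random.rand(400, 300)
    B = np.random.rand(300, 200)
    # Sanity check against NumPy's own matrix product.
    assert np.allclose(parallel_matmul(A, B), A @ B)
```

Each worker gets its own copy of B, so this trades memory for simplicity; it's nowhere near as efficient as a proper BLAS call, but it makes the parallelism explicit.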

Coming to GR, every class blows my mind, and I'm slowly coming around to liking, or at least being able to live with, the sheer number of Latin and Greek letters involved! But it's also a lot to remember and grasp. A lot of the properties are slipping my mind, and I guess practice will take care of that. Time and effort. The only two things that matter. I wish someone had hit me on the head at the end of my first year and told me that! Or maybe someone did and I just didn't care...
