PLA Planning: Next Steps

I did a little bit more infrastructure work on Parrot-Linear-Algebra this weekend. I wasn't able to get a lot done because we spent Saturday with family, but I got some nice groundwork laid for important future developments.

I started thinking about bindings for the LAPACK library, and much of the work I did this weekend was in preparation for that. LAPACK is particularly tricky because it's written in FORTRAN and, unlike the ATLAS library I use for my BLAS implementation, there don't seem to be any good C bindings to LAPACK available. Yes, there is the CLAPACK library, but I'm not a fan of the way that library is generated, and there are no premade CLAPACK packages for Ubuntu. When installing PLA on Linux, I would like users to be able to get all the prerequisites from their distribution's package manager rather than compiling them by hand, since some of these libraries can be extremely tricky to compile.

I've decided that, for the foreseeable future, we will link to the LAPACK library directly. This does mean we are going to need to write our own bindings and wrappers for the functions we want to call, but that's not a huge issue. Plus, I can start moving forward on something I've wanted for a while now: more flexible and configurable interfaces to BLAS and LAPACK. There are several implementations of each of these libraries (of BLAS especially), and I want PLA to support as many of them as possible.
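
To illustrate what those hand-written bindings involve: most FORTRAN compilers export routine names with a trailing underscore, every argument is passed by reference, and matrices are stored column-major. Here is a rough sketch using DGESV, a real LAPACK routine that solves a linear system; the pla_solve wrapper name and its error handling are hypothetical illustrations, not actual PLA code:

    #include <stdlib.h>

    /* DGESV is a real LAPACK routine: it solves the linear system
     * A*X = B. FORTRAN compilers typically export the symbol with a
     * trailing underscore, and every argument is passed by reference. */
    extern void dgesv_(int *n, int *nrhs, double *a, int *lda,
                       int *ipiv, double *b, int *ldb, int *info);

    /* Hypothetical wrapper (not PLA code): solve A*x = b for a single
     * right-hand side. Both buffers must be column-major, and both are
     * overwritten by LAPACK. Returns LAPACK's INFO code: 0 on success. */
    int pla_solve(double *a, double *b, int n)
    {
        int nrhs = 1, info = 0;
        int *ipiv = malloc(n * sizeof (int)); /* pivot indices */
        dgesv_(&n, &nrhs, a, &n, ipiv, b, &n, &info);
        free(ipiv);
        return info;
    }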

To get these kinds of flexible interfaces, we're going to need to support large swaths of conditionally-compiled code. This will get extremely unwieldy if we try to keep all the logic in the few monolithic PMC files I've been writing, so I decided to upgrade our build machinery to support linking additional C source files into the dynpmc build. This weekend I added that machinery and started factoring some of the supporting routines out of the various PMC types into those files. With common functions defined in one place, I can go through and make sure all my types use them consistently. Plus, I can start writing up complicated library bindings without making my already-large PMC files any larger.
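
To give a flavor of what ends up in those shared files (the names here are hypothetical illustrations, not PLA's actual helpers): BLAS and FORTRAN LAPACK both want column-major storage, so indexing logic like this should be written exactly once and used by every PMC type:

    /* Hypothetical common helpers, factored out of the individual PMC
     * files into one shared C source file. */

    /* Column-major offset of element (row, col) in a matrix with
     * 'rows' total rows, matching the FORTRAN storage convention. */
    static inline int
    pla_index(int row, int col, int rows)
    {
        return col * rows + row;
    }

    /* Two matrices can be multiplied only if the inner dimensions
     * agree: (m x k) times (k x n). */
    static inline int
    pla_can_multiply(int cols_a, int rows_b)
    {
        return cols_a == rows_b;
    }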

BLAS doesn't actually supply many routines that the PLA matrix types need. The most important, and the only one I call so far, is the GEMM routine, which performs the generalized matrix multiply/add operation:

Y = αAB + βC

where α and β are scalar values (FLOATVAL in NumMatrix2D, Complex in ComplexMatrix2D), and A, B, and C are matrices of compatible dimensions. This is a very powerful operation, and really the only one from BLAS that I need. The other BLAS routines are more highly optimized versions of this same operation for special matrix shapes (diagonal, triangular, etc.), plus operations between matrices and vectors, which I don't really support yet. Because there is really only this one routine, and because it's so powerful and common, I've included it as a method directly in the PMC types themselves. I may rethink that decision later, but for now it seems like a decent fit where it is.
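
For the FLOATVAL case, the call behind that method looks roughly like this through ATLAS's CBLAS interface (the pla_gemm wrapper is my sketch, not PLA's code). Note that BLAS itself computes C := αAB + βC in place, so a method that returns a fresh Y has to copy C into the output buffer first:

    #include <cblas.h>

    /* Sketch of a GEMM call via the CBLAS interface supplied by ATLAS.
     * Computes C := alpha*A*B + beta*C, where A is m x k, B is k x n,
     * and C is m x n, all stored column-major. */
    void pla_gemm(int m, int n, int k, double alpha,
                  const double *a, const double *b,
                  double beta, double *c)
    {
        cblas_dgemm(CblasColMajor, CblasNoTrans, CblasNoTrans,
                    m, n, k,
                    alpha, a, m,  /* lda = m */
                           b, k,  /* ldb = k */
                    beta,  c, m); /* ldc = m */
    }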

LAPACK is a little bit different, however. LAPACK contains hundreds of routines, all named in an idiosyncratic and difficult-to-decipher way: DGEEV, for example, is the Double-precision GEneral-matrix EigenValue driver. These routines are broken up into drivers, computational routines, and auxiliary routines. The driver routines perform the most important operations, typically calling sequences of the lower-level computational and auxiliary routines to do their work. Drivers include routines for linear least squares, eigenvalues and eigenvectors, and singular value decomposition. These are the ones PLA is going to target first, though access to the lower-level routines might be nice to have further down the road.

I am torn as to how I want to provide LAPACK bindings in Parrot. The obvious approach is to write them all up as methods on the PLA matrix types. That provides the easiest access, but it gets bloated and unwieldy very quickly: it adds huge amounts of code to my PMC files, and it adds the runtime overhead of creating Sub PMCs for all those methods every time PLA is loaded, whether your program needs LAPACK or not. That can be a pretty large cost to pay. It also doesn't scale well: any time we want to add a new LAPACK-based routine, we need to add a new method to several PMC types, present and future.

One alternative is to provide a separate library of functions and load it dynamically using Parrot's NCI tools. This option provides a little more flexibility, because we can load the routines we want without paying for the ones we don't. But I don't know what the current status of the NCI system is, and without significant improvement there, attempting to load a complex library like LAPACK would be a nightmare, even if I provided some simplified Parrot-specific wrappers. Also, I don't think the distutils library that PLA uses supports building standalone shared libraries that aren't dynpmc or dynops libraries, so I would have to write all that machinery myself. For all these reasons, I'm not sure I like this option either.

I'm also torn about whether I want to make some of PLA's prerequisites optional. Kakapo, for instance, is only used by the test suite: if you don't have Kakapo installed you can still build and use PLA, you just can't run the tests. BLAS is currently required because it implements the GEMM method and the various multiply vtables. I suppose I could make it optional too, but I think it would be a little sneaky if different builds of PLA supported different lists of VTABLEs and METHODs on the core types, especially for something as fundamental as multiplication. LAPACK is different, because even without it we have a pretty solid amount of core functionality. I am seriously considering making LAPACK an optional prerequisite and only including bindings for it when the library is installed. If I wrote a dynamic library wrapper for LAPACK and had users load that file manually to get those routines, the core library would still be usable without it.
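
In practice that would probably come down to a configure-time probe and a guard macro, something like the following sketch (PLA_HAVE_LAPACK is a hypothetical macro name; DGETRF is a real LAPACK routine, used here just for illustration):

    /* The build system would define PLA_HAVE_LAPACK only when a LAPACK
     * library is found at configure time. Everything inside the guard
     * vanishes from builds without LAPACK, leaving the BLAS-based core
     * intact. */
    #ifdef PLA_HAVE_LAPACK

    /* DGETRF: LU factorization of a general matrix. */
    extern void dgetrf_(int *m, int *n, double *a, int *lda,
                        int *ipiv, int *info);

    /* Hypothetical LAPACK-backed helper, compiled only when available. */
    int pla_lu_factor(double *a, int m, int n, int *ipiv)
    {
        int info = 0;
        dgetrf_(&m, &n, a, &m, ipiv, &info);
        return info;
    }

    #endif /* PLA_HAVE_LAPACK */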

I had also considered the idea (though I've been fighting it) of building some kind of PLALibrary singleton PMC type to manage all of this. In addition to reporting the library version and configuration options, a PLALibrary PMC could dynamically load methods and forcibly inject them into the necessary PMC types on request. This would let me build everything into a single dynamic library without defining, up front, all the methods I don't need. I may like this idea the best, even though I've been resisting a library singleton PMC until now.

There's a lot to consider, and I probably won't make any hard decisions about future plans for the next few days. In the interim I do want to put together some proof-of-concept wrappers for a few common LAPACK routines and start playing with them. If I could get routines to calculate eigenvalues, eigenvectors, inverses, and singular value decompositions, I think that would be a pretty great start.
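
As a taste of what those proof-of-concept wrappers might look like, here is a rough sketch around DGEEV, the real LAPACK driver that computes the eigenvalues of a general matrix. The pla_eigenvalues name and the fixed-size workspace are my simplifications; production code would query LAPACK for the optimal workspace size first:

    #include <stdlib.h>

    /* DGEEV: eigenvalues and, optionally, eigenvectors of a general
     * double-precision matrix. */
    extern void dgeev_(char *jobvl, char *jobvr, int *n, double *a,
                       int *lda, double *wr, double *wi,
                       double *vl, int *ldvl, double *vr, int *ldvr,
                       double *work, int *lwork, int *info);

    /* Hypothetical wrapper: eigenvalues only. 'a' is an n x n
     * column-major matrix (destroyed by the call); the real and
     * imaginary parts of the eigenvalues land in 'wr' and 'wi'.
     * Returns LAPACK's INFO code: 0 on success. */
    int pla_eigenvalues(double *a, int n, double *wr, double *wi)
    {
        char jobvl = 'N', jobvr = 'N'; /* skip the eigenvectors */
        int lda = n, ldvl = 1, ldvr = 1, info = 0;
        int lwork = 4 * n;             /* >= 3*n required for 'N','N' */
        double *work = malloc(lwork * sizeof (double));
        double vdummy;                 /* unreferenced when job = 'N' */
        dgeev_(&jobvl, &jobvr, &n, a, &lda, wr, wi,
               &vdummy, &ldvl, &vdummy, &ldvr, work, &lwork, &info);
        free(work);
        return info;
    }

One wrinkle worth noting: some FORTRAN compilers pass hidden string-length arguments after the regular ones for each CHARACTER parameter, so a real binding layer needs to account for the platform's FORTRAN calling convention. That is exactly the kind of detail I want confined to one shared file.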