Now again #LLMs: if you don't want to train your own #ai foundation model, you can patch an existing one with so-called #adapters. Benjamin Trim talked about their open-source adapter micro-framework: #Refiners works on top of #PyTorch, using declarative layers to patch models and a context API to store state. #fosdem2024
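To make the adapter idea concrete, here's a minimal sketch in plain PyTorch (not Refiners' actual API): a LoRA-style adapter that freezes a base layer and adds a small trainable low-rank residual on top. The class name and rank are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LoRAAdapter(nn.Module):
    """Wraps a frozen linear layer and adds a trainable low-rank residual."""

    def __init__(self, base: nn.Linear, rank: int = 4):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # the foundation weights stay untouched
        # Low-rank "patch": down-project to `rank`, then back up.
        self.down = nn.Linear(base.in_features, rank, bias=False)
        self.up = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.up.weight)  # adapter starts as an identity patch

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.up(self.down(x))

layer = nn.Linear(16, 8)
patched = LoRAAdapter(layer, rank=2)
x = torch.randn(3, 16)
# With a zero-initialised up-projection, the patched output initially
# matches the base layer exactly; only the two small matrices train.
assert torch.allclose(patched(x), layer(x))
```

Only the down/up matrices are trainable, so you fine-tune a tiny fraction of the parameters while the foundation model stays frozen.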
Well, we supply software for new libraries, absolutely #freesoftware and free as in beer too, and we can even help with #hosting costs so there's nothing out of pocket.
For existing libraries, which may already run systems they can't easily migrate off of, we can provide #adapters so that the MoP software can expose an #API for use in federation. The current code is set up using #objectorienteddesign to do just this, though each system will need its own adapter written!
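That adapter pattern could look something like this sketch (all names here are hypothetical, not MoP's actual code): the federation layer talks only to one abstract interface, and each legacy catalogue system gets its own concrete adapter.

```python
from abc import ABC, abstractmethod

class CatalogueAdapter(ABC):
    """One adapter per existing library system; federation code sees only this."""

    @abstractmethod
    def find_title(self, query: str) -> list[str]:
        ...

class LegacyIlsAdapter(CatalogueAdapter):
    """Hypothetical adapter for an existing ILS that can't easily be migrated."""

    def __init__(self, records: dict[str, str]):
        self.records = records  # stand-in for calls into the legacy system

    def find_title(self, query: str) -> list[str]:
        return [t for t in self.records.values() if query.lower() in t.lower()]

def federated_search(adapters: list[CatalogueAdapter], query: str) -> list[str]:
    """The federation API calls every adapter through the same interface."""
    hits: list[str] = []
    for adapter in adapters:
        hits.extend(adapter.find_title(query))
    return hits

ils = LegacyIlsAdapter({"001": "The Dispossessed", "002": "Parable of the Sower"})
assert federated_search([ils], "parable") == ["Parable of the Sower"]
```

The point of the design: supporting a new system means writing one small adapter class, with no changes to the federation code itself.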