ddelalamo,

“xTrimoPGLM: Unified 100B-Scale Pre-trained Transformer for Deciphering the Language of Protein”

A SOTA antibody structure-prediction method is tucked away in the results section

https://www.biorxiv.org/content/10.1101/2023.07.05.547496v1

lucas_nivon,
@lucas_nivon@mas.to

@ddelalamo Thanks for sharing. Reviewing this now internally and curious to see commentary on it.

ddelalamo,

@lucas_nivon While I don't want to understate the authors' achievements, my hot take is that this paper is impressive in scale but not in substance. Unlike with the original 2020 ESM paper, we did not learn anything fundamentally new about the emergent capabilities of PLMs at this size. The only thing I found truly novel was the drastic reduction of the antibody folding model from 48 Evoformer layers in AF2 to a single layer here. But neither that result nor its implications was explored in detail or discussed.
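
For anyone who wants a concrete picture of what "one layer on top of a PLM" means in practice, here is a minimal sketch. This is not the paper's actual architecture; the module names, dimensions, and the plain self-attention block are my own illustrative assumptions. The idea is just that per-residue embeddings from a pretrained protein language model are fed to a single block that regresses coordinates, instead of a 48-block Evoformer stack.

```python
import torch
import torch.nn as nn

class SingleFoldingBlock(nn.Module):
    """One attention block mapping per-residue PLM embeddings to C-alpha coordinates.
    Purely illustrative; not the xTrimoPGLM folding head."""

    def __init__(self, d_model: int = 1024, n_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.to_xyz = nn.Linear(d_model, 3)  # one (x, y, z) per residue

    def forward(self, emb: torch.Tensor) -> torch.Tensor:
        # emb: (batch, seq_len, d_model) residue embeddings from a frozen pretrained PLM
        attn_out, _ = self.attn(emb, emb, emb)
        x = self.norm1(emb + attn_out)
        x = self.norm2(x + self.ffn(x))
        return self.to_xyz(x)  # (batch, seq_len, 3)

# Dummy embeddings standing in for real PLM output, e.g. an antibody chain of 120 residues:
emb = torch.randn(1, 120, 1024)
coords = SingleFoldingBlock()(emb)  # predicted (1, 120, 3) coordinates
```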
