ddelalamo

@ddelalamo@mstdn.science

Protein engineering & synthetic biochemistry


ddelalamo, to random

“xTrimoPGLM: Unified 100B-Scale Pre-trained Transformer for Deciphering the Language of Protein”

A SOTA antibody structure-prediction method is tucked away in the results section

https://www.biorxiv.org/content/10.1101/2023.07.05.547496v1

ddelalamo,

@lucas_nivon While I don't want to understate the authors' achievements, my hot take is that this paper is impressive in scale but not in substance. Unlike the original 2020 ESM paper, we did not learn anything fundamentally new about the emergent capabilities of PLMs at this size. The only thing I found truly novel was the drastic reduction of the Ab folding model from 48 Evoformer layers in AF2 to 1 layer here. But neither that result nor its implications were explored in detail or discussed.

ddelalamo, to strucbio

"ColabDock: inverting AlphaFold structure prediction model for protein-protein docking with experimental restraints"

Uses experimental restraints as losses and back-propagates them to the input sequences. I wonder how many sequence changes are introduced to fit the data?

https://www.biorxiv.org/content/10.1101/2023.07.04.547599v1
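The idea in the post can be sketched in miniature (this is not ColabDock's actual code; the "predictor", residue-type distances, and restraint value below are all made-up toys): treat the input sequence as soft logits, run them through a differentiable predictor, score the output against an experimental restraint, and follow the gradient back to the logits. Counting how the argmax residue type changes is one way to ask "how many sequence changes were introduced to fit the data?"

```python
import math

# Toy sketch of restraint-guided backprop to an input sequence.
# One sequence position, 4 hypothetical residue types; the "predictor"
# is just the expected distance under the softmax-weighted sequence.
TYPE_DIST = [4.0, 6.0, 8.0, 10.0]  # hypothetical per-type distances (Å)
D_EXP = 9.0                        # hypothetical experimental restraint (Å)

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def loss_and_grad(z):
    p = softmax(z)
    d = sum(pa * da for pa, da in zip(p, TYPE_DIST))  # toy prediction
    loss = (d - D_EXP) ** 2                           # restraint loss
    # Chain rule through the softmax: dL/dz_b = 2(d - d_exp) p_b (d_b - d)
    grad = [2.0 * (d - D_EXP) * pb * (db - d) for pb, db in zip(p, TYPE_DIST)]
    return loss, grad

z = [2.0, 0.0, 0.0, 0.0]  # start strongly favouring type 0 (4 Å, violates restraint)
start_type = max(range(4), key=lambda i: z[i])
for _ in range(1000):
    loss, grad = loss_and_grad(z)
    z = [zi - 0.1 * gi for zi, gi in zip(z, grad)]  # gradient descent on the logits
end_type = max(range(4), key=lambda i: z[i])
print(f"final restraint loss {loss:.4f}; argmax residue type {start_type} -> {end_type}")
```

Even in this toy, satisfying the restraint forces the soft sequence away from its starting residue type, which is exactly the concern in the post: the optimized input may no longer be the sequence you started with.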
