typeswitch,
@typeswitch@gamedev.lgbt

is there ever a good reason to allocate a contiguous buffer in memory that is greater than 4GB?

asking for a friend
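
(for scale: a minimal numpy sketch, assuming a 64-bit interpreter and enough free RAM, of a single contiguous buffer sitting right at that 4GB line)

import numpy as np

# 2**29 float64 values * 8 bytes each = 4 GiB in one contiguous buffer;
# routine on a 64-bit system, impossible in a 32-bit address space
a = np.zeros(2**29, dtype=np.float64)
print(a.nbytes / 2**30, "GiB")  # 4.0 GiB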

exa,
@exa@mastodon.online

@typeswitch programmers are super lazy and in many languages regular arrays are still the only "sensible" interfaces between program parts...

this is a pretty regular occurrence:

import numpy as np
a = np.load("this 100gb blob of extracted features from pics")
scale_columns(a)

exa,
@exa@mastodon.online

@typeswitch (also literally anything that happens in matlab)

typeswitch,
@typeswitch@gamedev.lgbt

@exa how does this work? does it really allocate a 100gb array in memory and change all the values in memory?

astrid,
@astrid@fedi.astrid.tech

@typeswitch @exa if your computer has 100gb of memory, yes! if not, oom
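
(a hedged aside: if the blob is an .npy file, numpy can memory-map it instead of reading it into RAM, so nothing close to 100gb has to be resident at once; "features.npy" is a made-up name)

import numpy as np

# mmap_mode="r" maps the file instead of loading it; pages are pulled
# in on demand, so resident memory stays far below the file size
a = np.load("features.npy", mmap_mode="r")  # hypothetical file name
print(a.shape, a.dtype)  # reads header metadata only, no bulk data yet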

exa,
@exa@mastodon.online

@astrid @typeswitch

yeah many HPCs now have dedicated bigmem nodes to handle this kind of stuff :D

(also, let's just attach 500GB of M.2 swap right? 🤩 )

typeswitch,
@typeswitch@gamedev.lgbt

@astrid @exa ah i see! i need to re-calibrate my sense of scale, 'cos i still think of 4GB as a lot of memory. it's not the first time i've heard of computers with >100GB memory but it's still so far from anything i've worked with.

astrid,
@astrid@fedi.astrid.tech

@typeswitch @exa it's really common to have servers running >100gb these days. in fact i bought 2 used servers that came with that much preinstalled for only $200 each

typeswitch,
@typeswitch@gamedev.lgbt

@astrid @exa wow, ok. i just searched around and found someone selling a used server with 128gb for around 450 euros. i should see what i can get my hands on. thanks for opening my eyes to this possibility : )

exa,
@exa@mastodon.online

@typeswitch @astrid
Like, I'm certainly not saying that 4GB isn't enough memory for everyone, or that loading a 100GB blob is a good idea :)

exa,
@exa@mastodon.online

@typeswitch yeah, IMO that's the usual style of processing stuff in the applied data sciences. Just load a matrix. I see this in bioinformatics mostly, but I have no illusions about other areas being much better...

Generally the accepted way of doing stuff is not to optimize until you hit some hard allocation limit, and then to trim the data to fit under the maximum possible allocation size, because these are typically not the kind of programmers who'd know how memory allocation works.
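
(for illustration, a sketch of what optimizing against that limit tends to look like: the scale_columns from upthread rewritten to walk the matrix in row blocks; scale_columns_chunked and the chunk size are invented for the example)

import numpy as np

# hypothetical chunked rewrite of scale_columns: scaling block by block
# keeps the peak temporary at one row block instead of a full-matrix copy
def scale_columns_chunked(a, chunk=100_000):
    mean = a.mean(axis=0)   # pass 1: per-column statistics
    std = a.std(axis=0)
    std[std == 0] = 1.0     # guard against constant columns
    for start in range(0, a.shape[0], chunk):
        block = a[start:start + chunk]
        a[start:start + chunk] = (block - mean) / std  # pass 2: in place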
