Comments

tal, to linux in [Solved] Window manager with no (or very small) minimum window size?
@tal@lemmy.today avatar

I can get tiled windows that are definitely narrower than what OP has specified on a default sway config. I just randomly threw up a bunch of windows and got down to 27-pixel-wide windows. I didn’t try narrower or try splitting vertically, but it does make me wonder whether the limitation he’s hitting might be the particular application requesting a minimum window size, not the compositor. It looks like the X11 API does permit that:

stackoverflow.com/…/how-do-you-set-a-minimum-wind…

I wouldn’t be surprised, though I haven’t looked, if a given compositor or window manager would have the ability to override that.
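For what it’s worth, sway does expose knobs in this area, at least for floating windows — a sketch of the relevant config directives (names are from the sway man page; the values are just examples, and I don’t believe these affect tiled-window minimums, which come from the client’s own size hints):

```
# ~/.config/sway/config
# Let floating windows shrink down to a single pixel;
# sway otherwise imposes a default minimum floating size.
floating_minimum_size 1 x 1
# -1 x -1 removes the maximum-size limit entirely.
floating_maximum_size -1 x -1
```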

tal, to linux in [Solved] Window manager with no (or very small) minimum window size?

I haven’t tried creating tiny windows, but I would imagine that it would be pretty easy for you to just install compositors on Wayland or window managers on Xorg and test it yourself.

But, bigger question – I’m kind of curious why you’re dead-set on a bunch of tiny tiled windows to the extent of being willing to disregard other functionality of the window manager or even the windowing system. Like, what is your use case? Is this some kind of automated testing system?

tal, to gaming in When was a game's price worth it to you?

I am often willing to take a punt on a game that tries to do something creative and interesting.

take a punt on

scratches head

This has to be one of those cases where British English and American English mean essentially opposite things for the same phrase.

googles

Yup. Well, this goes on the list with “moot”.

Apparently in British English, this is “take a risk on doing something” and in the US it means to skip doing that thing.

dictionary.cambridge.org/dictionary/…/punt-on

to risk money by buying or supporting something, in the hope of making or winning more money

US informal

If you punt on something, you decide not to do or include it:

We punted on a motion that makes no sense.

en.wiktionary.org/wiki/punt

(Australia, Ireland, New Zealand, UK) To stake against the bank, to back a horse, to gamble or take a chance more generally

TIL. I guess it makes sense with the British English term “punter”.

tal, to technology in ‘Reddit can survive without search’: company reportedly threatens to block Google

I don’t think DDG runs its own indexer. It’s a frontend to other search engines.

tal, to memes in They really did.

House on Haunted Hill had Black Dude (Eddie) being one of the only two to make it out.

tal, to gaming in When was a game's price worth it to you?

Either it’s a series that I deeply love and know for certain will always put out quality games

Just about every lengthy series I’ve seen has had some lemons (which is why I really think that the practice of preordering is a terrible idea).

Zelda

www.youtube.com/watch?v=iPn3LIe2e3w

tal, to gaming in When was a game's price worth it to you?

It’s free and open-source (though one of the devs put a build up for $20 on Steam, which basically amounts to a donation). I’d definitely recommend it to someone who enjoys Project Zomboid and Rimworld.

tal, to gaming in When was a game's price worth it to you?

Welll…it depends. If you count DLC, there are games that have greatly outpaced inflation.

The Sims 4 costs nothing for the “base game”, but with all DLC – and that is still coming out – it’s presently about $1,100.

Another factor is that in many cases, the market has expanded. Like, in 1983, it wasn’t that common to see adults in the US playing video games. I am pretty sure that in a lot of countries, basically nobody was playing video games in 1983. In 2023, 40 years later, the situation is very different. The costs of making a video game are almost entirely fixed costs, separate from how many copies you sell.

So…if there is a game out there that many, many other people want to play, it’s going to sell a lot more copies.

I don’t really see the point in getting upset about a price, though – I agree with you on that. I mean, unless the game was misrepresented to you…it’s a competitive market out there. Either it’s worth it to you or it’s not, and if it’s not, then play something else. If someone is determinedly charging some very high price for a game in a genre, and a lot of people want to play that genre and it can be made profitably at a lower price, some other developer is probably going to show up sooner or later and add a competitor to the mix.

tal, to gaming in When was a game's price worth it to you?

Project Zomboid

I like the theme, like the ambiance, like the open world, and absolutely hate the combat in that game. Have you ever played Cataclysm: Dark Days Ahead? Same sort of setting and game, but turn-based and significantly more complex. Particularly since I see Rimworld on your list, I’m wondering if you might like it.

tal, (edited ) to gaming in When was a game's price worth it to you?

I think that most of the games that I’ve really enjoyed have been ones that tend towards the “full price” side money-wise, but which I have played for a long time, replayed a number of times, not just done a single pass. Gotten DLC for. Often modded.

Think:

  • Fallout 4
  • Oxygen Not Included
  • Caves of Qud
  • Civilization V
  • Stellaris
  • Noita
  • Kenshi
  • Nova Drift
  • Kerbal Space Program
  • Rimworld
  • Mount & Blade: Warband

The amount I’ve paid per hour of play on those is tiny.

My real constraint is the amount of time I have. I mean, I haven’t really been constrained by what it costs to play a game. I have a backlog of games that I’d be willing to play.

The waste, from a purely monetary standpoint, is overwhelmingly games that I buy, touch briefly, and don’t find myself playing at all. Frostpunk sounded neat, because I like similar genres (city-building), but I completely disliked the actual game, for example. A few Paradox games (Stellaris) I’ve really gotten into, but a number I’ve also found completely uninteresting (Europa Universalis, say). There are apparently a number of Europeans who are extremely into the idea of their historic people taking over Europe, and Paradox specializes in simulating those scenarios; I just don’t care about playing that out. Sudden Strike 4 – I’ve really enjoyed some real-time tactics WW2 games, like Close Combat, but couldn’t stand the more arcade-oriented Sudden Strike 4.

If you could give me a Noita, but high-resolution and with some neat new content and physics, I’d happily pay $100.

I’ve played Nova Drift for about 180 hours. That game presently sells for $18. So I paid about ten cents an hour. The price of the game is a rounding error in terms of the entertainment I got from it. Paying ten times as much for a sequel or DLC comparable to the stuff in the original game would be fine as long as I were confident that I’d enjoy and play it as much as I did the original game.

Sudden Strike 4 is about $20. I played it, forcing myself back to it, made it to about an hour total. So I paid about $20 an hour, or about 200 times the rate for Nova Drift. And I didn’t enjoy that hour much.
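Spelling out that back-of-the-envelope arithmetic (figures are the ones from this comment, purely illustrative):

```python
# Rough cost-per-hour comparison using the figures above.
nova_drift_cost_per_hour = 18 / 180      # $18 over ~180 hours
sudden_strike_cost_per_hour = 20 / 1     # $20 over ~1 hour

print(f"Nova Drift:    ${nova_drift_cost_per_hour:.2f}/hour")
print(f"Sudden Strike: ${sudden_strike_cost_per_hour:.2f}/hour")
print(f"Ratio:         {sudden_strike_cost_per_hour / nova_drift_cost_per_hour:.0f}x")
```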

In general, my preferred model would be for publishers to keep putting out DLC on highly-replayable games as long as people are interested in buying it: when I find something that I know I like, I want to be able to get more of it. If the Caves of Qud guy would hire more people to produce more content and just sell it as DLC, I’d be happy with that.

tal, to linux in AMD Wants To Know If You'd Like Ryzen AI Support On Linux

Yeah, but do either of those match the aims?

If you have a face unlock, you only rarely need to run it. It’s not like you’re constantly doing face recognition on a stream of video. You don’t have the power-consumption problem.

If you have an archive of 10000 photos, you probably don’t need to do the computation on battery power, and you probably don’t mind using a GPU to do it.

I mean, I can definitely imagine systems that constantly run facial recognition, like security cameras set up to surveil and identify large crowds of people in public areas, but:

  • I suspect that most of them want access to a big database of face data. I don’t know in how many cases you’d have a disconnected system with a large database of face data.
  • I doubt that most of those need to be running on a battery.

The reason I mention speech recognition is because I can legitimately see a laptop user wanting to constantly be processing an incoming audio stream, like to take in voice commands (the “Alexa” model, but not wanting to send recorded snippets elsewhere).

tal, to games in Genshin Impact's new web event uses nightmare-inducing AI animation

You can do upscaling with AI upscalers in SD today, yeah, and it’s pretty nifty, but it’s working with a 2D model. That’s nice if you have a lot of footage of Lawrence from exactly the same angle; if you train a model on the whole video, then you can use that for upscaling individual frames.

But my point is that if you have software that’s smart enough to make use of information derived with a 3D model, then you don’t need to have that identical angle to make use of the information there.

Let’s say that you’ve got a shot of Peter O’Toole like this:

…tcm.com/…/lawrenceofarabia1962.4455.jpg?w=824

And another like this:

…vanityfair.com/…/1389793754760_lawrencethumb.jpg

Those aren’t from the same angle.

But add a 3D model to the thing, and you can use data from the close-up in the first image to scale up the second. The software can rotate the data in three dimensions and understand the relationships. If you can take time into account, you could even learn how his robe flaps in the wind or whatnot.

One would need something like this.

tal, to linux in AMD Wants To Know If You'd Like Ryzen AI Support On Linux

I can imagine that there would be people who do want cheap, low-power parallel compute, but speaking for myself, I’ve got no particular use for that today. Personally, if they have available resources for Linux, I’d rather that they go towards improving support for beefier systems like their GPUs, doing parallel compute on Radeons. That’s definitely an area that I’ve seen people complain about being under-resourced on the dev side.

I have no idea if it makes business sense for them, but if they can do something like an 80GB GPU (well, compute accelerator, whatever) that costs a lot less than $43k, that’d probably do more to enable the kind of thing that @fhein is talking about.

tal, (edited ) to linux in AMD Wants To Know If You'd Like Ryzen AI Support On Linux

if it can use all the system RAM it might provide medium-fast inference of decent models

Yeah, I get what you mean – if I can throw 128GB or 256GB of system memory and parallel compute hardware together, that’d enable use of large models, which would let you do some things that can’t currently be done other than (a) slowly, on a CPU or (b) with far-more-expensive GPU or GPU-like hardware. Like, you could run a huge model with parallel compute hardware in a middle ground for performance that doesn’t exist today.

It doesn’t really sound to me like that’s the goal, though.

tomshardware.com/…/amd-demoes-ryzen-ai-at-compute…

AMD Demoes Ryzen AI at Computex 2023

AI for the masses.

The goal for the XDNA AI engine is to execute lower-intensity AI inference workloads, like audio, photo, and video processing, at lower power than you could achieve on a CPU or GPU while delivering faster response times than online services, thus boosting performance and saving battery power.

Much of the advantage of having an inbuilt AI engine resides in power efficiency, a must in power-constrained devices like laptops, but that might not be as meaningful in an unconstrained desktop PC that can use a more powerful dedicated GPU or CPU for inference workloads – but without the battery life concerns.

I asked McAfee if those factors could impact AMD’s decision on whether or not it would bring XDNA to desktop PCs, and he responded that it will boil down to whether or not the feature delivers enough value that it would make sense to dedicate valuable die area to the engine. AMD is still evaluating the impact, particularly as Ryzen 7040 works its way into the market.

That sounds like the goal is providing low-power parallel compute capability. I’m guessing stuff like local speech recognition on laptops would be a useful local, low-power application that could take advantage of parallel compute.

The demo has it doing facial recognition, though I don’t really know where there’s a lot of demand for doing that with limited power use today.

tal, to Ukraine_UA in Russian soldiers are getting hard drugs delivered to their trenches in Ukraine to escape boredom, report says