flop_leash_973,

Haha, it amuses me to no end that ever since I watched a “Down the Rabbit Hole” video on YouTube about TempleOS a few years back, I have seen it crop up in various places from time to time, as I don’t remember ever seeing anything about it before.

Makes me wonder if it was always there and I just didn’t notice it until I was familiar with it.

sherlockholmez,

“Baader-Meinhof phenomenon” or “frequency illusion.”

original2,

Google Trends shows a steady increase in popularity

Dragster39,

There’s a name for this phenomenon which I will notice all around me once I remember the name of said phenomenon.

sherlockholmez,

“Baader-Meinhof phenomenon” or “frequency illusion.”

aBundleOfFerrets,

I think in this specific scenario you can attribute most of its popularity directly to that video

KillingTimeItself,

ah yes, a fireship viewer.

cupcakezealot,
@cupcakezealot@lemmy.blahaj.zone avatar

all the cool people are just waiting for aos to come back.

Resol,
@Resol@lemmy.world avatar

Get an Intel powered Mac and install Linux on it, just to mess with Apple.

(Note: only Windows and macOS are officially supported on these kinds of computers, and on the new Apple silicon Macs, only the latter)

justme,

I could be as rich as God and still wouldn’t go for Windows or Apple. I would rather invest the money in good FOSS development

utubas,

That’s fantastic, man! Congratulations, really!

Honytawk,

Probably the reason why you aren’t rich in the first place

ulterno,
@ulterno@lemmy.kde.social avatar

I can objectively state the contrary.
Macs don’t cause riches in customers.
Riches cause Mac customers.

justme,

Yes, my idealism is one of the reasons, another is severe ADHD.

chatokun,

Being a support person, if I were rich enough to frivolously buy systems, I’d have at least one of each as a reference system. Yes, I know, VMs, but those are for saving money/space. Especially for Mac I’d want some real hardware too. Definitely not a main system, though. I currently have a broken Mac and a cheap Chromebook for that reason, though being broken, the Mac is rather useless now. When it worked I often used it to help test/troubleshoot customer stuff.

fruitycoder,

For real, I’ve put in a fraction of what I would have spent on unfree software into supporting open source, and I’ve gotten way more out of it.

I see it as a difference between owning and renting.

kamen,

Where’s Hannah Montana Linux?

kurcatovium,

This meme was obviously created inside of it.

pewgar_seemsimandroid,

please extend this A LOT (with commonly known distros like Mint and Rocky and FreeBSD)

rsuri,

Do you have multiple monitors?
Yes - Don’t buy a mac
No - Still don’t buy a mac

acockworkorange,

I mean, yeah, don’t ever buy a Mac, but what’s up with the multiple monitors? Do they struggle with it?

efstajas, (edited )

macOS out of the box fucking sucks for monitor scaling with third party monitors. It’s honestly laughable for a modern OS. You can install some third party software that fixes it completely, but it really shouldn’t be necessary. I use an (admittedly pretty strange) LG DualUp monitor as a secondary, and out of the box macOS can only make everything either extremely tiny, extremely large, or blurry.

Other than that, I’ve had no problems at all, and the window scaling between different DPI monitors is a lot smoother than it was with Windows previously.

rsuri,

For me it’s that, compared to Windows and Linux, handling multiple windows between screens is always problematic, and it’s made worse by alt-tab bringing up all the windows for an application, which means they pop up on the other monitors too, which isn’t usually what I want. Maximizing is usually not as straightforward as one would hope, and the dock moves to any monitor if you leave your pointer at the bottom, which can get annoying fast. As some point out, there’s apparently 3rd-party software that fixes these issues, but that’s not an option for me because I use a locked-down Mac for work and can’t install 3rd-party software, so I’m stuck with the annoying base behavior.

KoalaUnknown, (edited )

The base model chips only support 2 monitors. The Pro, Max, and Ultra chips all support more.

dditty, (edited )

The base model chips only support 1 monitor.

Apple artificially limits the base model chips to only support 1 monitor FTFY

EDIT: revised statement based on what I learned about framebuffers below:

Apple intentionally builds base-level MacBooks without adequate framebuffers to force users to buy upgraded, more expensive products.

areyouevenreal,

They all support two monitors (one internal and one external for macbooks, and two external for desktops). It’s not an artificial restriction. Each additional monitor needs a framebuffer. That’s an actual circuit that needs to be present in the chip.

Honytawk,

So they cheaped out on what is supposed to be a premium brand, gotcha

becausechemistry,

What percentage of people who buy the least expensive MacBook do you think are going to hook it up to more than two displays? Or should they add more display controllers that won’t ever be used and charge more for them? I feel like either way people who would never buy one will complain on behalf of people who are fine with them.

Zangoose,

The least expensive MacBook is still $1000, closer to $1500 if you spec it with reasonable storage/ram. It really isn’t that much of a stretch to add $100-300 for a 1080/1440p monitor or two at a desk.

areyouevenreal,

Not necessarily. The base machines aren’t that expensive, and this chip is also used in iPads. They support high resolution HDR output. The higher the number of monitors, resolution, bit depth, and refresh rate, the more bandwidth is required for display output and the more complex and expensive the framebuffers are. Another system might support 3 or 4 monitors, but not support 5K output like the MacBooks do. I’ve seen Intel systems that struggled to even do a single 4K 60 FPS until I added another RAM stick to make it dual channel. Apple does 5K output. Sure, they might technically support more monitors in theory, but in practice you will run into limitations if those monitors require too much bandwidth.

Oh yeah, and these systems also need to share bandwidth between the framebuffers, CPU, and GPU. It’s no wonder they didn’t put 3 or more very-high-resolution buffers into the lower-end chips, which have less bandwidth than the higher-end ones. Even if it did work, the performance impact probably isn’t worth it for a small number of users.
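Rough, illustrative arithmetic (my own figures, not from the thread) for how fast display bandwidth grows with resolution, refresh rate, and bit depth; this counts uncompressed pixel rate only and ignores blanking intervals and Display Stream Compression:

```python
# Rough uncompressed video bandwidth: pixels * refresh * bits-per-pixel.
# Ignores blanking intervals and Display Stream Compression, so real
# link requirements are somewhat higher; figures are illustrative only.

def bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=30):
    # 30 bpp approximates 10-bit-per-channel HDR output.
    return width * height * refresh_hz * bits_per_pixel / 1e9

fhd = bandwidth_gbps(1920, 1080, 60)       # one 1080p/60 monitor
five_k = bandwidth_gbps(5120, 2880, 60)    # 5K/60, Studio Display class
six_k = bandwidth_gbps(6016, 3384, 60)     # 6K/60, Pro Display XDR class

print(f"1080p60: {fhd:.1f} Gbps")
print(f"5K60:    {five_k:.1f} Gbps (~{five_k / fhd:.0f}x a 1080p monitor)")
print(f"6K60:    {six_k:.1f} Gbps (~{six_k / fhd:.0f}x a 1080p monitor)")
```

By this rough measure a single 5K60 HDR stream costs about as much bandwidth as seven 1080p60 monitors, which is the quantity-versus-quality trade-off being described.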

dditty,

TIL, thanks! 🌝

I use a Plugable docking station with DisplayLink with a base-level M1 MacBook Air and it handles multiple (3x 1080p) displays perfectly. My (limited) understanding is that they do that just using a driver. So at a basic level, couldn’t Apple include driver support for multiple monitors natively, seeing as it has adequate bandwidth in practice?

areyouevenreal,

Sigh. It’s not just a fricking driver. It’s an entire framebuffer you plug into a USB or Thunderbolt port. That’s why they are more expensive, and why they even need a driver.

A 1080p monitor has one quarter of the pixels of a 4K monitor. The necessary bandwidth increases with the pixels required. Apple chooses instead to use the bandwidth they have to support two 5K and 6K monitors, instead of supporting, say, 8 or 10 1080p monitors. That’s a design decision that they probably thought made sense for the product they wanted to produce. Honestly, I agree with them for the most part. Most people don’t run 8 monitors, very few have even 3, and those that do can just buy the higher-end model or get an adapter like you did. If you are the kind of person to use 3 monitors, you probably also want the extra performance.
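The quarter-resolution claim is easy to check (taking “4K” as UHD 3840x2160 and 1080p as 1920x1080; the 5K figure below is my own addition for comparison):

```python
# Pixel counts: one 4K (UHD) panel equals four 1080p (FHD) panels.
fhd_pixels = 1920 * 1080          # 2,073,600
uhd_pixels = 3840 * 2160          # 8,294,400
five_k_pixels = 5120 * 2880       # 14,745,600 (5K panel)

print(uhd_pixels // fhd_pixels)               # -> 4
print(round(five_k_pixels / fhd_pixels, 1))   # -> 7.1
```

So a single 5K monitor already pushes more pixels than seven 1080p monitors combined.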

dditty, (edited )

Thank you for taking the time to reply, and for sharing your expertise in our conversation! I understand different resolutions, that the docking station has its own chipset, and why the Plugable is more expensive than other docking stations as a result. I now have a more nuanced understanding of framebuffers and how DisplayLink interfaces with an OS like macOS.

Allow me to clarify the point I tried to make (admittedly, I didn’t do a good job of expressing it previously). Rather than focusing on the technical specs, I intended to have a more general conversation about design decisions and Apple’s philosophy. They know that consumers will want to hook up a base-tier MacBook Air to two external displays, and intentionally chose not to build in an additional framebuffer, to force users to spend more. I sincerely doubt there’s any cost saving for the customer from Apple not including that out of the box.

Apple’s philosophy has always been that they know what’s best for their users. If a 2020 M1 MacBook Air supports both the internal 2K display and a single external 6K display, that suggests to me it should have the horsepower to drive two external 1080p displays (that’s just a feeling I have, not a known fact). And I’ll acknowledge that Apple has improved this limitation for the newer MBAs, which allow you to disable the built-in display and use two external displays.

My broader point is that Apple “knows what’s best” for their users: they want customers to buy an Apple display rather than to just stick with the 1080p LCDs they already own, because they’re not Retina®. Which do you honestly think is a more common use-case for a MacBook Air user: wanting to connect to two monitors (home office, University classroom system, numerous board room settings I’ve worked in, etc), or to connect their $1200 MBA to a $1600-$2300+ Studio Display? For that, anyone with an iota of common sense would be using a MBP etc since they’re likely a creative professional who would want the additional compute and graphics power for photo/video-editing, etc.

I don’t disagree with your explanation of the thought process behind why Apple may have made this hardware decision for MBAs, but it is effectively an arbitrary, non-cost-saving decision that will certainly impede customers who expect two displays to just work, since they can do that on their 10-year-old Toshiba Satellite or w/e.

Thanks, and have a great day

areyouevenreal,

It’s not just about Retina displays. High-res and HDR aren’t uncommon anymore. Pretty much all new TVs anybody would want to buy will be 4K. It has to support the Apple 5K display anyway, because that’s one of their products.

As we’ve discussed, two external displays are supported on the new MacBook base models. It was a bit of an oversight on the original, sure, but that’s been fixed now.

Also, the same SoC is used in iPads. It’s not Mac-only. I can’t imagine wanting three displays on an iPad.

becausechemistry,

I have a Mac with multiple monitors. It handles them a hell of a lot better than my PC at work.

lud,

I don’t think it’s even possible to use more than two monitors on an M-series computer (except maybe if you spend extra for the Max edition)

KoalaUnknown,

That is only the case on the base model chips. The Pro, Max, and Ultra chips all support more.

lud,

Yeah, it’s a ridiculous limitation.

KoalaUnknown,

deleted_by_author

    lud,

    It’s still ridiculous to limit it.

    Pretty much any modern computer should be able to output to more monitors than that.

    becausechemistry,

    limit it

    There isn’t some software limitation here. It’s more that they only put two display controllers in the base level M-series chips. The vast, vast majority of users will have at most two displays. Putting more display controllers would add (minimal, but real) cost and complexity that most people won’t benefit from at all.

    On the current gen base level chips, you can have one external display plus the onboard one, or close the laptop and have two externals. Seems like plenty to me for the cheapest option.

    lud,

    If true they are some pretty shitty chips.

    Having two external monitors + the built-in monitor is extremely common.

    At work almost everyone has at least two monitors because anything less sucks (a few use just a big external one plus the built-in), and it’s also common to use the built-in monitor for stuff like Slack or Teams.

    Having more than two monitors isn’t a “pro” feature. It’s the norm nowadays.

    Sure it might be enough for the cheapest option if the cheapest option was cheap. Unfortunately they are absolutely not cheap, and are in fact fairly expensive.

    becausechemistry,

    At work, my work PC laptop drives two 1080p monitors. I don’t keep it open to use the onboard one because Windows is so terrible at handling displays of different sizes, and the fans run so much when driving three displays that I think it could take off from my desk. So I know what you’re talking about.

    But. Have you ever used a Mac with two displays? A current-gen MacBook Air will drive a 6K@60Hz and a 5K@60Hz display when closed, and it’ll do it silently. Or both displays at “only” 4K if you want to crank the refresh rate to over 100Hz. You think that’s not enough for the least expensive laptop they sell?

    I’m really tired of people who don’t know what these machines are capable of telling me why I shouldn’t enjoy using my computer.

    areyouevenreal,

    Yeah people don’t get that they are trading output quantity for output quality. You can’t have both at the same time on lower end hardware. Maybe you could support both separately, but that’s going to be more complex. Higher end hardware? Sure do whatever.

    areyouevenreal,

    Not really. There is a compromise between output resolution, refresh rate, bit depth (think HDR), number of displays, and the overall system performance. Another computer might technically have more monitor output, but they probably sacrificed something to get there like resolution, HDR, power consumption or cost. Apple is doing 5K output with HDR on their lowest end chips. Think about that for a minute.

    A lot of people like to blame AMD for high idle power usage when they are running multi-monitor setups with different refresh rates and resolutions. Likewise, I have seen Intel systems struggle to run a single 4K monitor because they were in single-channel mode. Apple probably wanted to avoid those issues on their lower-end chips, which have much less bandwidth to play with.

    lud,

    There is no reason they couldn’t do 3 1080p monitors or more, especially when the newer-generation chips are supposedly so much faster than the generation before.

    areyouevenreal,

    Well yeah, no shit Sherlock. They could have done that in the first generation. It takes four 1080p monitors to equal the resolution of one 4K monitor. Apple, though, doesn’t have a good enough reason to support many low-res monitors. That’s not their typical consumer base, who mostly use Retina displays or other high-res displays. Apple only sells high-res displays. The display in the actual laptops is way above 1080p. In other words, they chose quality over quantity as a design decision.

    lud,

    1080p is perfect for getting actual work done though.

    And there is no reason why they couldn’t allow you to have multiple normal-res monitors. It’s just a limitation to get you to overspend on a more expensive computer.

    FiniteBanjo,

    If you have money you still shouldn’t buy a mac.

    cordlesslamp,

    Let’s say I want to try Linux but I want to keep my Windows OS intact (for now), and I only have 1 SSD in my PC.

    Is there a solution that I can just partition the drive, install Linux, switch between OS by just restarting without affecting the other, AND later on remove one OS without wiping the SSD?

    lengau,

    One of the most important things to recognise before I start: Don’t try to make something permanent right now. None of this needs to be written in stone. Choose what’s going to be best for you right now and know that in a few weeks or months you might want to change it. With that in mind:

    What do you want out of Linux right now? A development system? Are you looking to see what it would be like to move away from Windows? Something else?

    Let’s start with the development system. Let’s say you’re comfortable on Windows and just want to do a few things that are easier or more convenient on Linux. In that case, you probably want Windows Subsystem for Linux. This will get you a bunch of things, including the ability to quickly and easily try out a bunch of distributions. Of course, this is going to be primarily a command line experience. You’re not going to get the “full experience” with a desktop environment, etc. But if you just “need Linux for a couple of things,” this is a great intro.

    Next, let’s say you want to try Linux out, see what the desktop is like, etc. This is a great opportunity to try a virtual machine. You’ll have limitations (less hardware access, maybe not as smooth a desktop as if it were on the hardware directly), but it’s a great way to play with distributions, especially if you want to explore multiple distros. (I’ll get to distros below)

    Got a distro you want and want to try it as your “main environment” for a while? Other folks have mentioned how to dual boot. Here, the most critical part in my opinion is to put your important data onto a third partition that’s easily accessible to both. On Linux, I’d suggest bind mounting directories from that partition in your home directory. If you want to wipe an OS later it’ll be a bit rough, but you can do it. You’ll just need to boot from a live USB to do it, and of course be very careful about what partitions you delete.

    Now, for distros:

    Everyone is going to recommend their pet distro, and to that end I recommend [REDACTED]. But! Here’s my actual guide for selecting a distro:

    1. Got a friend who’s willing to spend a decent amount of time helping you? Go with whatever they suggest, at least for now. It’s okay if it’s not where you’ll be eventually. What they’re familiar with right now will speed up their ability to help you, which will speed up your learning. What they use may well not be where you end up and that’s okay. I do however have two exceptions to this: first, if they suggest Gentoo or NixOS as your intro distro, find someone else. Gentoo and NixOS are both fantastic, but they are very much not beginner distros. In 6 months or a year though, they might be something you want to play with if you’re interested in doing a deep dive into Linux. Second, have them with you while you’re doing the install. You want to be doing the install, but they should be there to guide you and answer questions.
    2. Doing this on your own? Go with a beginner friendly distro. The main recommendations I have here are Ubuntu spins or Fedora spins. There may well be people who reply to my comment spewing hate about one or both of those recommendations, and while there’s controversy about both of these, at the end of the day they’re both great. (Conflict of interest declaration: I work for the company that makes one of those distros, and the other one is some of our biggest competition. I applied for this job in part because I thought that one of the things the community loves to hate about one of these was Great, Actually™, but I wanted to improve some of the things that I think are actually valid criticisms.)

    If internet randoms tell you “X is garbage, don’t use it,” feel free to disregard them. Most Linux distros are great. They all have smart, dedicated people working on them, and they each have their own vision of how they want it done. These ideas conflict sometimes, but that’s okay.

    And one final thing… Don’t fight against your distro’s way of doing something. At least not now. Most people telling you to do something that works against the distro are doing so for ideological, not practical, reasons. You don’t need to get involved in ideological wars - enjoy Linux for its positives!

    Cethin,

    This comment is good, but it’s very much the “scared of change” comment. It recommends the smallest amount of change possible, which might be good for some people but just diving in will probably be a better introduction.

    You don’t learn how to swim by sitting in a bath tub. You have to get into the water. Maybe wear some safety gear (dual boot or other options), but if you’re reasonably confident and/or competent you’ll be fine getting into Linux as long as you’re using one of the major distros.

    I assume almost everyone who has made it to Lemmy is competent enough with a computer to handle the transition to Linux. It really isn’t all that hard if you know how to use a search engine.

    AnUnusualRelic,
    @AnUnusualRelic@lemmy.world avatar

    Those are wise words.

    Remember that in the end, all the distributions end up doing and installing pretty much the same thing (from the user’s pov). It doesn’t matter all that much what you use. Most of the major ones work just fine.

    bitfucker,

    Yes, first you need to resize the partition to accommodate the new OS. Usually 40-60 GB is good enough for a minimal Linux installation if you don’t do any gaming or run other massive applications. The resizing can be done in Windows using the Disk Management utility baked into Windows, or some other partition manager (easeus, magic tools, etc). After that, Linux can be safely installed in the free space as a single partition.

    Now, sometimes the bootloader gets fucked up, but it is quite easy to fix. In fact, if you use GRUB, it usually runs os-prober for you to check for any other OS. So sometimes, fixing it is as simple as rerunning grub-mkconfig. But other times it is not as simple; it will vary depending on what happened and is too long to list here. The Arch Wiki usually covers a lot of the topic, so you could try searching there, especially on the topic of the boot sequence.

    Lastly, if you need to move the partition, the data already inside will need to be moved too. This can take time depending on the size. But it is doable and safe.

    If, later down the road, you want to remove either OS, you can simply remove the partition after moving the data off it first. Linux can mount NTFS natively, so no problem there. On Windows, there is a program called ext4 explorer or something along those lines to browse and copy from a Linux filesystem (which is usually ext4). Don’t forget to remove the boot information too after you’re done removing the partition.

    Now, there is also the other suggestion to use a live environment, but I didn’t suggest it since the experience can be lacking and more of a hassle in and of itself.

    cordlesslamp,

    Thanks, but on second thought I don’t want to risk anything, as I’m not quite the “technical” kind. I don’t even know how to dual boot two different Windows versions. I don’t think I’d be able to fix it if anything broke.

    So I’ll buy another cheap SSD and put Linux on it while unplugging my old SSD. Then I’ll choose the boot drive during POST.

    I’m damn sick of Windows BS, I hope this’ll work out.

    bitfucker,

    Yeah, that’s fair. But I will still recommend anyone trying out Linux AND having a problem to consult the Arch Wiki when they can. It is amazing what they have there. It will also increase your technical understanding of how your system works over time. But if you don’t have any problems when driving Linux, that is good too. It just means Linux for the masses is coming closer.

    For some distro recommendations: if you love to tinker, I’d say go Arch. You will learn a lot about your computer too; it is also how I learned about mine and got the know-how for a lot of things. But if you don’t have the time to tinker, I’d recommend Bazzite. I’ve read their documentation and came to the conclusion that if anything goes wrong, it would be easy to recover from, it has a great community, and it is based on a solid distro.

    Cethin,

    I want to add to this that Windows sometimes has its own ideas and decides it owns the disk. I had a dual boot with Windows and Linux, and a Windows update fucked up the file system. I was able to recover almost everything without much issue, though it did require some extra tools and some knowledge. The boot partition I never recovered, though. (I was able to fix things enough to boot into the Linux install again, but not Windows, no matter what I tried.)

    This was about a year ago, maybe a bit more. The issue I had with Linux prior to this, which is why I was dual booting, was gaming. By this point gaming was perfectly fine for me to ditch Windows, so I just grabbed all the files I needed to keep and set the drive up with a fresh install.

    lud,

    In general dual booting windows and Linux on the same disk is risky.

    celeste,

    Yes, other options to try Linux while keeping Windows are Windows Subsystem for Linux (WSL) or booting live from a USB.

    cordlesslamp,

    Thanks, what I want to try out is the gaming capabilities. I don’t know if a VM or live USB can do that reliably.

    I heard that AMD GPUs are better with Linux, right?

    Cethin,

    Both of those will have worse performance, but I don’t see why they wouldn’t work. Whenever the system needs to grab more data, it’ll have to go to the USB to get it, which is slow. You could load a game that’s stored on the disk already (this will require more effort and knowledge than installing Steam and letting it install games locally on your Linux drive), so that’d be better, but system data will still be slow. If you have a lot of RAM, it’ll reduce how often data is grabbed, so it’ll reduce the issues after boot.

    Land_Strider,

    I’ve been trying out Mint (Cinnamon) for some months now. I have an AMD Ryzen 5 3600 CPU and an AMD Radeon 6700 XT graphics card, both of which work splendidly on Mint out of the box. This installation is my first-ever attempt at using Linux, with dual booting on top of it (on the same SSD, with partitioning), but I’d say it set up more nicely than any Windows install I’ve ever done over the years. Writing the .iso file to a USB drive was a bit different from what I’m used to with Rufus for Windows, but Rufus can write it.

    Mint (Cinnamon) is based on Ubuntu, which is itself a heavily modified Debian but still retains good surface-level compatibility with it.

    While Arch is great and all, if you are looking for a lifeline after years of being a Windows user, having finally decided not to move to the next Windows version because of all the things they keep breaking and all the ad and data mining they do, Mint is a great starting distro. It gets installed with all the hardware drivers present, for AMD hardware at least, but Nvidia should work too. There is no need to install anything extra to get a modern working computer environment. As long as the OS installation goes correctly and it boots up, you are good to go.

    As for regular stuff:

    1. LibreOffice is preinstalled, and I find it pretty good even though I had quite a dislike for it before. Select a theme and a layout preset for the toolbar, and you are right in your element, as if you are continuing to use MS Office.
    2. Gaming with Steam is just turning on one setting in Steam settings, the compatibility tab (Proton), and that’s it. Most games work out of the box. For others, check ProtonDB for what people say about the game. They usually work, or there is a little basic fiddling required at best. I can play Hunt: Showdown with Easy Anti Cheat without a hassle on it. Just another little Proton file installed, that’s all.
    3. For Windows-only programs, you can use Wine. Wine works in the background, and when properly installed, it allows you to just double click any .exes and run them. Programs can be a bit slower than using them on Windows, but most of them work on Linux with Wine if it is what matters to switch from Windows. You can play a lot of non-Steam games through that, too.
    4. Mint has a Microsoft Store-like program repository where you can install programs and their dependencies with one click. This works well most of the time, but sometimes the Flatpak versions can be problematic. I’ve had Steam, Discord and Wine installed through it, and they had problems to some extent. For these, I switched to grabbing .deb installation files from their own websites, or in the case of Wine, installing per the instructions on its website using a few terminal commands, which isn’t more complicated than using the Registry Editor or Group Policy Editor in Windows.
    5. Most other common stuff has good alternatives, with downsides or upsides. Switching from MPC to VLC, from Photoshop to Gimp, MS Office to Libre Office, etc. The internet forums have many detailed answers to these, or you can always ask for thoughts yourself. There usually is an alternative most of the time.

    One thing to keep in mind: As Mint Cinnamon is based on Ubuntu, you can use answers for Ubuntu most of the time. However, while using the answers, keep these in mind as a form of cheatsheet when troubleshooting, or looking for implementing things:

    Mint (Cinnamon) v21 and above is based on Ubuntu 22.04 LTS, called Jammy, not Ubuntu 20.04 LTS, called Focal. Almost all answers for 22.04 LTS will work on Mint Cinnamon, and all repositories and programs for it will work on Mint too. Answers for 20.04 LTS, or the recent 24.04 LTS, will mostly still apply, but they are not directly what you are using.

    Mint Cinnamon also uses a GNOME-based desktop environment (Cinnamon started as a GNOME fork), not KDE, so keep that in mind when looking for answers. It also uses X11 (Xorg) by default as its display server, not Wayland.

    smnwcj,

    I'll add that I'd highly recommend making a backup before doing anything. You can also more safely try out Linux in a virtual machine.

    mexicancartel,

    Yes, I did this less than a week ago.

    I shrank the Windows main partition (the C: drive) to only about 70 GB, since I don’t want to use it at all, then made a live USB and went with custom partition selection. Then you have to assign certain partitions for Linux. The /boot/efi mount point should be pointed at the Windows boot partition so both OSes show up in the bootloader.

    Then you have to create at least a root and a swap partition, and you can have a separate home partition if you want to install a different Linux distro later without losing data from the first one.

    Zink,

    I just installed Linux Mint into a dual boot setup recently. Unsurprisingly, their install process made it pretty easy to partition the drive and have everything play nice together.

    bradboimler,
    @bradboimler@startrek.website avatar

    Hey, I have a Mac and can confirm I am poor 🤣

    pewgar_seemsimandroid,

    Hackintosh

    bradboimler,
    @bradboimler@startrek.website avatar

    No, M2 Mac. I didn’t buy it, it was a present lol

    pewgar_seemsimandroid,

    ok

    Blisterexe,

    How did you get a macbook as a gift?

    bradboimler,
    @bradboimler@startrek.website avatar

    It was a bonus from my work, and I won it in a raffle.

    Blisterexe,

    Oh cool

    growingentropy,

    Buying a Mac didn’t help.

    bradboimler,
    @bradboimler@startrek.website avatar

    I didn’t buy it. I’ve used Linux for 15 years; it’s actually running Asahi Linux

    Moshpirit,
    @Moshpirit@lemmy.world avatar

    This one seriously lacks Hannah Montana Linux.

    ILikeBoobies,

    There is never a wrong time to choose Linux

    moon,

    Why would Apple be associated with being rich? They’re pretty average-priced now.

    wonderfulvoltaire,

    It’s the storage prices. Any more than 1 TB and you gotta sell a kidney.

    moon,

    That’s true of like every phone too tho

    growingentropy,

    My coworker does graphic design, and he paid $8k for his Mac laptop.

    I honestly don’t know if there’s anything I could order in a Windows/Linux laptop that would add up to $8k.

    I get great performance out of my $90 eBay laptop running Debian, by the way.
