So thoroughly CEO-brained he’s sabotaging his own business. He’d rather have his serfs in spitting distance than a future for his company. Truly incredible.
Multiple people can screen share at once over Slack, plus people not sharing can use a pencil tool to highlight specifics on the screen, as in “you need to change this setting”
TPM is basically never for your benefit. It's becoming a requirement because Microsoft is going to one day say "you can only run apps installed from the Windows Store, because everything else is insecure" and lock down the software market. Valve knows this which is why they're going so hard on the Steam Deck and Linux.
TPM actually provides some useful components to isolate encryption outside of Ring 0, which is a trust win. But any technology must be weighed against its power to oppress.
And its power to make the system less secure. Isolating things outside ring 0 means malware can isolate itself outside ring 0 as well, and then it’s impossible to detect or remove without throwing out the entire machine.
Which is much, much scarier than anything an ordinary rootkit might do.
I’m sure you’ll be ok sending me your social security number, home address, bank login details, credit card number, a copy of all the files on your hard drive…
Sure, but does a grandmother’s Solitaire & Facebook PC really need quick encrypting and decrypting? Anyone not dealing with sensitive info doesn’t need one.
Sure there are. If it gets compromised with malicious code, I have no way of removing it.
I can protect ring 0. I can keep crap out of ring 0. If all else fails, I can nuke everything in ring 0 and boot a fresh OS installation. But I can’t do a single bleeping thing except throw out the whole machine if malware takes over ring -1.
This is already the case with your motherboard firmware, which fTPM is a part of. You are correct in that you have no real way to handle malware in it except throw it away. This doesn’t change in any way if you get rid of TPM.
It’s the way everything is moving. Hardware protected keys can be very useful but it’s a double edged sword. It’s more secure but also allows companies to lock consumers out.
We need rules that say when this tech is used the consumer still gets full control over it. Like what Google does with their Pixel phones and the Titan chip. Not what Apple does.
It’s only more secure until someone discovers yet another RCE bug in the firmware, and then you’ve got malware in your machine that’s impossible to detect or remove.
Like what google does? You mean disallowing people who use a privacy respecting android rom from using their banking apps and such? Soon very possibly banking websites included?
Yes, the reason is to securely store cryptographic keys, even your own. Strictly speaking it's the UEFI Secure Boot key store, not the TPM itself, that usually comes preloaded with Microsoft's keys, and you're free to delete them and enroll your own.
Sadly, I agree. I’m at the point now where, as long as I’m not trying to game, I can thrive on Linux. But even then I spend way more time than necessary getting things to work that do so out of the box on Windows. We have a long way to go before legacy apps are the only reason to run it.
Personally, I found that the time I save by actually having control over my system more than makes up for the tinkering I have to do to get things running. My laptop would regularly become unusable for 20+ minutes on Windows because of disk performance issues, and I, as the user, had no way to stop Windows from running the service that locked everything up. That, along with all the other times Windows decided my use case was less important, added up to far more lost time than having to debug a game here and there.
The people who prefer Windows for gaming are not the people who will have performance issues at the OS level; their rigs are powerful enough to run complex games, and the OS-based performance loss is negligible in comparison. Hell, I sometimes don’t reboot the work computer for days and it doesn’t freeze at all. The system is on an SSD and there are no hiccups nor disk performance issues. In any case, with current prices, buying a new M.2 stick and new RAM is less than 100€ total, and to be honest, I’d rather pay that and be fine for 4-5 years than spend a big part of my free time trying to make The Witcher 3, Baldur’s Gate 3, Path of Exile, tons of Steam games, and League work perfectly on Linux. It’s just not worth it.
I use WSL for work because coding in a Linux environment is better but I still need access to office tools, because companies work with those tools.
Linux won the server war, but it still has a lot to do to win the home/work computer war.
Ungh, yeah I used to have that problem with my laptop when I was in college.
I only booted it up for classes unless I had a test coming up I needed to study for or something. Because why the fuck would I not do that - I had a regular computer at home for everything else.
Every couple weeks, that meant it was updating instead of being available for note taking, and usually for the entire hour I needed it. Because apparently setting the updates to run during shutdown wasn’t good enough, they needed to be run on boot, because fuck you that’s why.
Linux is just… hey I should probably update this shit at some point… meh, tomorrow.
Because apparently setting the updates to run during shutdown wasn’t good enough, they needed to be run on boot, because fuck you that’s why.
Oh it also loves to install updates on shut down. So when you need to leave the class room to go home that fucking thing tells you to not cut power because it needs to install shit. Fuck you, I need to catch my bus!
Legit idgaf if you want to be plugged in for an update, if it’s inconvenient I’m unplugging it, fuck you for thinking I won’t, and it’s above 60% battery so it doesn’t matter anyway.
Maybe if my computer wasn’t buying so much avocado toast it could manage resources better.
"Long way" won't be long, because Google 2.0, err, MS' direction continues to make Win worse over time (cloudify everything, extract more data and strip more rights+control from each user, and gain more money via price-increasing subscription models) while the open source desktop ecosystem around Linux is getting noticeably better for almost every user every ~5 years or so. The era of Windows as a "pure" OS died with W7. Since W10, it's OS + integrated malware. Start of downfall.
Those things matter to you and me but we’re in the minority. As long as Johnny Gamer and Grandma Facebooker can still do their preferred activities in Windows there’s a close to zero percent chance they’ll put the effort into making the switch.
It seems unlikely Valve will ever make Windows the primary OS for their devices. And they’d lose a lot of user support if they ever required the TPM for their own software, so hopefully they wouldn’t risk it.
Why does everybody seem to think that userspace attestation is the only use for the TPM? The primary use is for data to be encrypted at rest but decrypted at boot as long as certain flags aren’t tripped. TPM is great for the security of your data if you know how to set it up.
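That "decrypted at boot as long as certain flags aren’t tripped" behaviour is just a measurement chain. Here’s a rough Python sketch of the idea; the function names and boot components are illustrative, not a real TPM API. Every boot component gets hashed into a running register, and the key is only released if the final value matches what it was sealed against.

```python
# Hypothetical sketch of TPM-style "sealing to boot state". Not a real
# TPM API; it only illustrates the measurement-chain concept.
import hashlib

def extend(pcr, component):
    """Mimic a TPM PCR extend: PCR = H(PCR || H(component))."""
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

def measure_boot(components):
    pcr = b"\x00" * 32  # PCRs start zeroed at power-on
    for c in components:
        pcr = extend(pcr, c)
    return pcr

good_boot = [b"firmware v1.2", b"bootloader v2.0", b"kernel 6.8"]
sealed_pcr = measure_boot(good_boot)  # value the disk key was sealed against

def unseal(disk_key, expected_pcr, boot):
    """Release the key only if the measured boot matches the sealed state."""
    return disk_key if measure_boot(boot) == expected_pcr else None

key = b"secret-disk-key"
assert unseal(key, sealed_pcr, good_boot) == key  # normal boot: key released
tampered = [b"firmware v1.2", b"evil bootloader", b"kernel 6.8"]
assert unseal(key, sealed_pcr, tampered) is None  # flag tripped: no key
```

Real TPMs do this with PCR banks and a seal/unseal policy (e.g. BitLocker or LUKS bound to a PCR); the point is that swapping out any measured component changes the final digest, so the key stays locked.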
Valve is never going to require TPM attestation to use Steam, that’s just silly. Anti-cheat companies might, but my suggestion there is to just not play games that bundle malware.
Anti-user features which are enabled by games and programs that were already anti-user before this. Hardly worth getting upset about, nothing has really changed. You already should have been avoiding them, because they were already anti-user.
TPM is basically never for your benefit. It’s becoming a requirement because Microsoft is going to one day say “you can only run apps installed from the Windows Store, because everything else is insecure” and lock down the software market. Valve knows this which is why they’re going so hard on the Steam Deck and Linux.
This is the comment I was replying to. I was simply pointing out that for a company “going hard” on SteamDeck and Linux, it’s curious that they would spend any amount of effort at all enabling the TPM to allow people to run Windows. I guess my point is I don’t think they’re “going hard” quite as much as the person I responded to thinks.
Also, it was just pointing out that this can specifically affect the Steam Deck, since it uses an AMD processor with AMD fTPM.
I don't see how it affects the Steam Deck. It's entirely possible that the Steam Deck supports fTPM purely because it was part of the motherboard template Valve chose and it would have been more trouble to change it than to just leave it in.
They are “going hard” the way I see it. Without Valve doing legwork behind the scenes and collaborating with anti-cheat developers, we wouldn’t even have Apex Legends running on Linux like we’ve had for a year and a half. They’ve been talking about wanting to make Linux a viable PC gaming platform to escape Microsoft’s lockdown of their platform since the days of Steam Machines, when Windows 8 and the new store app were giving bad signs.
Either way Valve would be silly not to provide a compatible way to use Windows on the Deck. Even though the situation is much better these days, they know very well that a lot of enthusiast PC gamers would be dismissive of the Deck if Windows couldn’t work properly on it and that word of mouth would bring less confidence in the product.
And now imagine Linux actually had more desktop market share. But for that, Linux needs at least a little more software support to be reliable for other people, and that software is usually not open source. Maybe with Flatpak it will finally get somewhere in that regard, if there’s enough interest from people.
Most people are unable to administer their own systems, so GNU/Linux, an operating system built on empowering developers and administrators, is basically unimaginable for them.
Microsoft and Apple have co-opted the admin duties for users, and that’s why people use their operating systems. It spares them from the disaster we all saw and experienced in the Windows XP days, but that comes at a price.
It’s not software support; it’s not anything to do with Linux. It’s a computer illiteracy problem.
Android could, in some respects, be considered Linux’s biggest success story among regular users, and that’s because Google co-opts the admin duties.
Most people dont want an OS to be different. They are happy if it boots up and does what they want to do. It’s not lazy, it’s an active disagreement with the premise.
This is why nobody upgrades to Windows 10 from 7, or to 11 from 10. Security risks and lack of features aside, their OS just works for them.
Sorry, but that’s just wrong. Enough people simply don’t even consider Linux because the software they need doesn’t work and there’s no equivalent alternative. And my PC/OS is not a hobby or an ideology. It’s a tool I work with.
Is it really wrong? Do you have numbers? I think the most people claim above is at least plausible. It surely fits my personal experience, but that is of course not worth much.
I would argue that most people use their PC for web browsing, light photo editing and personal office stuff and maybe gaming (at least outside work) and those people are not affected by “the software I need does not work and there is no alternative”.
Your first point is web browsing. Even that doesn’t work properly on a Linux desktop lol. Browser performance is abysmal because the browsers lack out-of-the-box support for hardware acceleration. Even if you get it to work, it might not work reliably, and an update might break it again.
Try using a Discord call and opening a 4k YouTube video at the same time on a freshly installed Linux desktop. The audio will be choppy and the video will drop frames like crazy. Just moving windows around on your desktop is not nearly as smooth as it is on Windows.
You seem to be very misinformed. Browsers do not lack hardware acceleration; some distributions just don’t include the necessary packages in their default configuration. Some. And once you set it up, like in Arch Linux, where almost nothing is installed by default, it works flawlessly for years. I’ve never had an update break browser hardware acceleration.
I can run 12 4k youtube videos at the same time and route the audio to different channels of my different audio devices AND accept several calls from different webapps and the only thing that is not smooth is your way of discussing things LOL
I’ve had this issue on several distros, and multiple friends have the same issue. Video hardware acceleration in a browser is a mess. This is definitely not only affecting me, as there’s a significant number of complaints on forums and Reddit.
And there is no way the average computer user will use Arch. And as long as you have to fiddle around with your system to get even the most basic shit running smoothly, like watching a high-resolution YouTube video while moving windows around on your other screen, Linux will stay irrelevant as a desktop OS. It’s still a system for nerds, and I kinda feel like that’s okay.
I would argue that most people use their PC for web browsing, light photo editing
Maybe it’s just me, but I know nobody who still uses a PC for these things anyway. The vast majority of people use their smartphones or tablets for basic stuff like that. People who still use PCs or laptops usually do more than that.
Gaming is fine if you use Steam and the compatibility layer or jump through hoops, and don't play basically anything online.
The photo editing tools on Linux are dogshit.
Web browsing is fine, but not if you want to stream any content, because no one will serve you anything even medium quality without DRM.
Office stuff can kind of be replaced, but mostly by using the browser versions of the shit people actually use, because the tools to collaborate with others (particularly non-techy people) don't exist for open source alternatives.
The software available is absolutely a massive limitation.
Basically the entire multiplayer space is locked out. It's a massive compromise. And every platform that isn't Steam requires significant manual configuration and still has issues.
No, they're not good. And they're not suitable for any normal person because the UX is a dumpster fire.
Nobody with normal tv/movie content gives you comparable quality on Linux.
Yes, normal people do need to collaborate. And no, none of the office options on Linux are capable of functional collaboration for normal people, except Google/microsoft through browser nonsense.
Basically the entire multiplayer space is locked out.
Not all multiplayer games use these anti-cheat techniques (and those might just work in the near future anyway). CS:GO works perfectly, Rocket League does, Dota 2 does, LoL did at least (I don’t know what they’re up to these days), 7 Days to Die does, Paradox grand strategy does, Mordhau does, Path of Exile does, and those are only some of the games I personally can confirm.
And they’re not suitable for any normal person because the UX is a dumpster fire.
People who use Photoshop professionally mostly agree that GIMP is a great app with just a few drawbacks compared to Photoshop. The UI was a dumpster fire, but they sorted that out. Photo editing is on par with Photoshop, at least with some free plugins. If your UX sucks, maybe it’s an error on OSI layer 8.
Nobody with normal tv/movie content gives you comparable quality on Linux.
I’m still running 1080p on everything and Netflix delivers 1080p to all my linux boxes. Is there a problem with 4k?
Yes, normal people do need to collaborate. And no, none of the office options on Linux are capable of functional collaboration for normal people, except Google/microsoft through browser nonsense.
Which tools on Windows allow easy collaborative office projects other than Microsoft or Google? Well, other than CryptPad, OnlyOffice, Koofr, almost every Nextcloud provider, Etherpad…
I am a gamer and I run into "the software doesn't do what I want, and there's no way around it / no alternative" very often.
Almost always because I want to run another file in the same Proton instance of a game, to install a mod or do something else.
Or because something just doesn't work, despite following the instructions and others getting it to work.
Like, Cyberpunk is my most recent example. CET doesn't work: I followed the guide, installed the packages the guide said to, still nothing. It doesn't prevent me from running the game, but it certainly stops me from enjoying it the way I want to.
I think it’s more that there’s a perception of things not being compatible with Linux nowadays. A lot of the games that didn’t work 5 years ago now do, and I’m still seeing people complain that games like Halo Infinite don’t work on there when they actually do.
The only things I can think of that aren’t compatible and required for some tasks are Photoshop and professional CAD/CAE software. For >90% of the population Linux should be able to handle everything they need
Realistically, Windows is really good at repairing itself (or at least getting itself back to a state that’s usable again, which to most users counts as ‘repaired’).
Until Linux has some sort of system like this, it’s just not worth the headache for 99% of users. Linux errors aren’t even that descriptive when they happen, and could be caused by just about anything.
They don’t even bother to give you anything more than an error code that’s applicable to 482885 different root causes.
Indeed, the repair functionality works. But yeah, the problem will be solved; Linux has moved exceptionally far towards usability and will continue to do so.
It will never be the year of the linux desktop, until linux is easy to use and easy to troubleshoot and fix.
and let me tell you, every minor problem requiring some kind of arcane terminal ritualism in ancient enochian that only veteran sysadmins know, is not, and will never be, easy to use or troubleshoot.
There, I just gave you 2 ways to turn that arcane terminal ritualism in ancient Enochian that only veteran sysadmins know into a plain English service manual that any literate human being can use to figure out basically any terminal application ever.
Yeah, I’ve done --help. It doesn’t make it simple, and it doesn’t magically let you figure out how to solve the problem, assuming you even know which package is causing it.
I’ve gone through more than enough fixing of more than enough problems as an average, not-a-sysadmin person. I know how bullshit it is. Just because you are used to it doesn’t make it easier for regular people to use.
Microsoft has done a lot of shit wrong, but the one thing they got right is the usability of the OS: how any idiot can be sat in front of a computer and know what they’re doing with less than a day of faffing about, and can easily fix most common problems in a few clicks.
I can’t speak for other distributions, but Pop!_OS has had a “Refresh Install” option for a while now that does exactly this. It hasn’t happened often, but there have been a couple of times when something borked my system to the point of no longer booting, and re-running the installer in “Refresh Install” mode got everything back up and running within 30 minutes while preserving all of my non-system files. In particular, that meant I didn’t have to re-download my Steam and other locally installed games, which is significant because they are the largest apps on my system.
Fedora doesn't provide binary drivers even when they exist; on a huge number of laptops you need a supported plug-in USB WiFi adapter just so you can add the repositories and configure the binary drivers to get the built-in WiFi working.
Ubuntu does provide binary drivers, but the configuration tool crashes by itself a lot of the time and simply fails to load the driver.
Ubuntu's desktop sometimes just crashes.
Fedora uses a memory compression driver (zram) to handle its swap, and this can sometimes crash the OS entirely by itself.
These are major issues that shouldn't be issues, they should either have been fixed as a priority for the crashes or have some kind of workaround that doesn't require owning specific USBs that regular people just won't have. There's no reason for the memory compression thing either, it probably doesn't do that much for performance overall but random hard-locks are a huge negative. Linux is its own worst enemy on the desktop.
Sometimes the issue with WiFi chipsets is not the distro but the manufacturer. Debian, for instance, now includes non-free firmware on its installation ISO image, but some manufacturers (e.g. Broadcom) do not allow redistribution of their firmware, so Debian can’t legally include it. And unfortunately those manufacturers don’t make it easy to “just download the firmware” so you can put it on the USB stick where the installer can see it. (Literally the only issue with putting Debian on my old 2013 MacBook Pro was the Broadcom firmware; fortunately, having a Debian desktop, I could install the firmware downloader there to get the two files the installer needed.)
This is not a fault of the Linux distro, but a fault of the hardware manufacturer. Unfortunately, like the smell of piss in a subway, we all have to deal with Broadcom.
Why do you need full disk encryption in your day-to-day life? Are you a secret agent? I feel like that would give you away though.
It’s not that I have nothing to hide; that defense is stupid. It’s that you should use security adapted to your needs, because otherwise the cost doesn’t offset the benefit. And with disk encryption, a normal person will far more often be sorry than happy.
People are imperfect. People have left laptops full of personal and/or commercially sensitive data on trains or planes, had them stolen from cars and houses, etc. Full disk encryption is a defence against data breaches, especially for computers that are not bolted down. Or it might be as simple as a person not wanting the embarrassment of their porn stash being found.
You are only seeing what TPM is now, not what TPM will become when it turns into an entire encrypted computing processor capable of executing any code while inspection is impossible.
Yes, it’s right in the name “trusted platform module”. It’s no secret that the ambition is to become a space to run code outside the user’s reach and scrutiny.
They start with the most legitimate and innocuous purpose. Once it is adopted and ubiquitous, it will not suffer the fate of the earlier attempts and rot on the vine.
Then, surprise: TPM 5.0 becomes a full-scale, full-speed trusted execution environment, and it’s too late to do anything about it. Eventually, non-trusted processing capability will be phased out and only Intel-approved, signed code will run.
I’m still on the hunt for a desktop Linux distro that has no security features or passwords. My usage for this may not be common but it can’t be rare enough that there are zero options
I agree that there should be an easy setting to at least allow updates without password. I installed Manjaro for my mom, after a while she complained “there are updates every day and I need to input the password too many times”
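For what it’s worth, on distros whose polkit supports JavaScript rules (Arch-based ones like Manjaro do), there is a way to drop the password prompt for updates via a rules file. This is only a sketch; the PackageKit action IDs and the group name are assumptions to verify on your own system with `pkaction`:

```javascript
// /etc/polkit-1/rules.d/49-nopasswd-updates.rules
// Hypothetical example: let members of the "wheel" group refresh and
// install packages without a password prompt. Check the exact action
// IDs on your system with `pkaction | grep packagekit`.
polkit.addRule(function(action, subject) {
    if (action.id.indexOf("org.freedesktop.packagekit.") === 0 &&
        subject.isInGroup("wheel")) {
        return polkit.Result.YES;
    }
});
```

Whether you want updates running with no authentication at all is its own trade-off, but for a single-user machine like your mom’s it’s a reasonable one.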
I’m an engineer with trade secrets on his laptop. I’ve heard of dozens of people getting laptops stolen from their cars that they left for like ten or fifteen minutes.
The chances are slim, but if it happens I’m in deep trouble whether those secrets leak or not. I’m not taking the risk. I’m encrypting my disk.
It’s not like there’s a difference in performance nowadays.
TPM’s not going to help with that situation, though, right? Either you’re typing in your encryption password on boot (in which case you don’t need TPM to keep your password), or you’re not, in which case the thief has your TPM module with the password in it.
From what I understand, TPM is “trusted” because of the fact the secrets it contains are supposed to be safe from an attacker with hardware access.
This is what makes it good at protecting data in case of a stolen laptop. This is also what makes it good at enforcing offline DRM or any kind of system where manufacturers can restrict the kind of software users can run on their hardware.
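Both sides of that coin come from the same primitive: a signed statement of the boot measurements, made with a key the owner cannot extract. A toy Python sketch of the idea, where an HMAC secret stands in for the TPM’s attestation key and all the names are illustrative, not any real API:

```python
# Illustrative only: the same "signed statement of boot measurements"
# primitive can protect you (stolen-laptop case) or restrict you (DRM,
# attestation). TPM_SECRET stands in for a key fused into the chip that
# nobody, including the owner, can read out.
import hashlib
import hmac

TPM_SECRET = b"fused-at-factory"  # hypothetical unextractable key

def measure(components):
    pcr = b"\x00" * 32
    for c in components:
        pcr = hashlib.sha256(pcr + hashlib.sha256(c).digest()).digest()
    return pcr

def quote(components):
    """The 'TPM' attests to the current boot measurement."""
    pcr = measure(components)
    return pcr, hmac.new(TPM_SECRET, pcr, hashlib.sha256).digest()

# The vendor's allow-list of blessed configurations.
APPROVED = {measure([b"vendor firmware", b"vendor OS"])}

def server_allows(pcr, sig):
    """A remote service (bank app, DRM, anti-cheat) checks the quote."""
    expected = hmac.new(TPM_SECRET, pcr, hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig) and pcr in APPROVED

assert server_allows(*quote([b"vendor firmware", b"vendor OS"]))       # stock system: allowed
assert not server_allows(*quote([b"vendor firmware", b"custom ROM"]))  # your own OS: refused
```

The same check that lets your laptop prove it booted untampered is what lets a bank app or anti-cheat refuse to run because you booted something the vendor didn’t bless.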
I mean, I do have some stuff that I encrypt, but encrypting the folder, or putting it on a small partition and encrypting only that filesystem after booting, makes more sense to me.
So you never caught a team of government officials in your living room brute-forcing your bootloader at 4am as you got up to use the bathroom, huh. Lucky guy.
I tried to open the wired link and got a 404, then tried again and got a 504, then tried again and got a 503.
I then opened the lifehacker link, and it opened fine. The content of that link gives me the impression Ghostery may have had ties to ad companies. At the bottom of the article they link to Mashable as their source here:
Since the Wired article seems to be the only one I can't open, I guess it is unable to defend itself beyond the title of the article, which says that (1) Ghostery is now open source and (2) Ghostery has a new business model. Based on what I can see, it would appear to me as though Ghostery was actually owned/managed by Evidon. My interpretation of that would have to be that their OLD business model included selling information to advertisers. I tried to go to evidon.com but it was blocked by my intentional DNS poisoning (a sign that it is a scummy domain). After temporarily changing my DNS resolver to one of the servers hosted by
Which is clearly a business designed to help other businesses monetize web services while staying just barely legal, and to maximize the amount of data a marketer can pull from people without getting in shit for not actually getting consent from them.
So, when you say
"It is not, and never has been, in league with ad companies."
Do you mean I have imagined all of the above? Because it sounds pretty shady to me that a company affiliated with Evidon and Crownpeak would be making a product line like the ones at Ghostery.
Sigh... now that I am home I am able to open the Wired article. The second link is to a vice.com article which says:
"Ghostery 6.0 is a from-the-ground-up re-imagining of how to design a privacy-enhancing browser extension so that its features are more easily accessible to a mainstream audience."
In other words, this is NOT their old version, and it says nothing about any previous versions, ownership, management, or financing of the product. The fourth link in the article is to another Wired article:
"Ghostery, another popular ad blocker, operates under a different model. As a user, you don't see ads and aren't tracked by pesky data trackers. The company, however, makes money by collecting anonymized data on what those trackers pick up. It repackages that data and resells it to publishers, websites, and other companies it says can use the information to help improve the speed, privacy, and performance of their sites."
Followed by a footnote that says:
"UPDATE 3:47 PM ET 03/02/16: This story has been updated to accurately reflect that Ghostery does not collect the same data that third-party trackers collect, but rather collects and sells data about the trackers themselves."
I have a hard time not seeing this as:
"Ghostery was getting a shitty reputation because people did not understand that they were selling information about stooges to other stooges. Their solution was to make a dramatic shift in their business model in hopes that they could win back privacy points."
When it comes to digital privacy, I am not big on second chances. If Meta says they are going to opensource some portion of their crap it doesn't win them any points with me and I won't be trusting them with any digital data. Whatever anyone else's opinion may be, there is plenty there to keep me from trusting Ghostery, opensource or not.
I'm also not a fan of Wikipedia [not a primary source] but even they have this:
@Xylight Firefox v2.0.20 in 2008 was the last "good" version. Since then it has been the awesome bar, Pocket, forced VPN ads, FTP support, PAC support, XUL bombing, and phone calls home. I use Librewolf, and while it is tolerable it is still tied to a garbage browser. :-(
In previous versions you could search about:config on the "value" field; this is no longer possible. Searching for https:// and http:// would give you a list of numerous URLs, most of which are under Mozilla's own domains. Some might argue that things like updates are necessary to ensure a secure browser. Others might argue that they have run very outdated browsers without problems for years, and that, combined with forced updates and the Maintenance Service, the log files generated produce a not-insignificant amount of information about users.
Suggesting using a decades-old browser with known exploits because “well, I never got hacked” is like saying vaccines are unnecessary because “well, I never got sick”.
For the sake of being complete, I downloaded Firefox v2.0.20, which of course does not come in x64 flavour. So after installing assorted i386 libs, it did in fact run. The first-start popped up the image below, after which it showed a clearly broken and distorted mozilla.org page. It was also essentially unusable, since it doesn't meet the minimum TLS requirements to view most sites. Disroot SearX, ddg.gg, StartPage, ecosia.org, and even the client test page at ssllabs.com all failed to load. google.com did load, but searching for "alternative search engines" was a failure because none of the sites in the results could be opened. Now did you really think that me saying v2.0.20 is the last "good version", meant that anyone should rush off to download it and try to use it today?
@Xylight still hungry eh? Forced updates, telemetry, maintenance service, quick release schedule, ignoring system theme, and bug reports that are ignored for years and years.
@Xylight LAWL -- clearly if you think those things are imagined then you're not looking, and are not willing to. You can defend Firefox all you like, I can badmouth it all I like, and neither of us has to give a fuck what the other thinks.
I’m perfectly fine with criticism of Firefox, I use librewolf myself. You’re overblowing the issues though. If you don’t search for crap like pocket, you’ll only maybe find it once in the settings menu.
I don’t know what “Firefox” you’re talking about. None of this is an issue at all (except the anonymous telemetry, which you can disable easily). Firefox has actually surpassed Chrome in speed in its latest update. And even if you were right, would that justify using that spyware Chrome? I’m willing to compromise on things so I don’t support a monopoly. We have to get out of our comfort zone a bit if we really care.
CEO: “We have observed through careful analysis that by locking our customers inside the restaurant, they will continue to order food from us in order to not starve. Therefore, from now on, all doors shall now be one way only”
This CEO is so entitled he’s completely flipped the last few centuries of capitalist economic theory on its head.
No longer is capitalism about convincing business owners or consumers to invest in your product or business. Now, the consumer is lucky to have had the opportunity to purchase your product, as though you are a king or deity; magnanimous in your product offering.
They should be so lucky, to be forced to pay you a monthly tithing for your racketeering.
“Every time a customer buys a printer, it’s an investment for us. We are investing in that customer, and if that customer doesn’t print enough or doesn’t use our supplies, it’s a bad investment.”
They literally can’t help themselves. They’ve gone from treating their employees like an investment vehicle, where if it doesn’t perform well enough they stop investing in it, to doing exactly that to their customers as well. (They aren’t exactly investing in their employees either; they consider an employee’s low pay an “investment” in the employee. Never mind that the employee can’t afford an apartment on their own on that pay.)
You know how little your boss thinks of you and how disposable they think you are?
Yeah, well, they think that about the customers now, too.
“You can easily be replaced with another customer who prints more,” is what they are saying to themselves.
The company I work for has a contract with HP to provide and service the printers. My department uses a printer every day. In addition to internal use, we print receipts and documents for clients who sometimes only have a few minutes to wait. We have been told that our printers are going to be removed because we don’t print enough. Our page count isn’t high enough to justify the cost from HP, despite the fact that we literally can’t do our jobs without them. The result is that we’ll have to walk the floor until we find an available cloud printer, no matter how far away or inconvenient it is. For corporations it’s all about the numbers. Metrics, budget, etc. How it affects their employees doesn’t matter to them.
That’s why I’ve been using YouTube without logging in and if using in browser, I have the cookies autodelete after I close the page to start new each time.
It never really recommended me what I wanted anyway. I guess the algorithm doesn’t work on me.
I have been trying to figure out why since I started using it... searches spin forever, videos spin forever, some videos just spit "Error 1003" immediately, and then they become accessible 10 minutes later. I even tried filing an issue to no avail. I may end up looking for other alternatives.
I have, but the ones I tried weren't much better. I looked again and there's a few new ones, so I'll try those and update if they work faster/more consistently.
The Apple M series is not ARM based. It's Apple's own RISC architecture. They get their performance in part from the proximity of the RAM to the GPU, yes. But not only. Unlike ARM, which has become quite bloated after decades of building upon the same instruction set (adding new instructions to drive adoption even when that runs contrary to RISC's philosophy), the M series started anew with no technical debt. Also, Apple controls both the hardware and the software, as well as the languages and frameworks used by third-party developers for their platform. They therefore have 100% compatibility between their chips' instruction set, their system and third-party apps. That allows them to make CPUs with excellent efficiency. Not to mention that speculative execution, a big driver of performance nowadays, works better on RISC, where all the instructions have the same size.
You are right that they do not cater to power users who need a LOT of power though. But 95% of the users don't care, they want long battery life, light and silent devices. Sales of desktop PCs have been falling for more than a decade now, as have the investments made in CISC architectures. People don't want them anymore. With the growing number of manufacturers announcing their adoption of the new open-source RISC-V architecture I am curious to see what the future of Intel and AMD is. Especially with China pouring billions into building their own silicon supply chain. The next decade is going to be very interesting. :)
The whole “Apple products are great because they control both software and hardware” always made about as much sense to me as someone claiming “this product is secure because we invented our own secret encryption”.
Here’s an example of that: Apple needed to ship an x86_64 emulator for the transition, but emulation is slow and thus makes the new machines appear much slower than the older Intel-based ones. So what they did was come up with their own private instructions that greatly speed up an emulator’s task and add them to the chip. Now most people don’t even know whether they run native or emulated programs, because the difference in performance is so small.
The Apple M series is not ARM based. It’s Apple’s own RISC architecture.
M1s through M3s run ARMv8-A instructions. They’re ARM chips.
What you might be thinking of is that Apple has an architectural license, that is, they are allowed to implement their own logic to implement the ARM instruction set, not just permission to etch existing designs into silicon. Qualcomm, NVidia, Samsung, AMD, Intel, all hold such a license. How much use they actually make of that is a different question, e.g. AMD doesn’t currently ship any ARM designs of their own I think and the platform processor that comes in every Ryzen etc. is a single “barely not a microprocessor” (Cortex A5) core straight off ARM’s design shelves, K12 never made it to the market.
You’re right about the future being RISC-V, though, ARM pretty much fucked themselves with that Qualcomm debacle. Android and android apps by and large don’t care what architecture they run on, RISC-V already pretty much ate the microcontroller market (unless you need backward compatibility for some reason, heck, there’s still new Z80s getting etched) and android devices are a real good spot to grow. Still going to take a hot while before RISC-V appears on the desktop proper, though – performance-wise server loads will be first, and sitting in front of it office thin clients will be first. Maybe, maybe, GPUs. That’d certainly be interesting, the GPU being simply vector cores with a slim insn extension for some specialised functionality.
Thanks for the clarification. I wonder if/when Microsoft is going to hop on the RISC train. They did a crap job trying their hand at an ARM version a few years back and gave up. A RISC Surface with a compatible Windows 13 and a proper binary translator (like Apple did with Rosetta) would shake the PC market real good!
The Mac Pro is a terrible deal even compared to their own Mac Studio. It has the same specs but costs almost $1000 extra. Yes, the cheese grater aluminum case is cool, but $1000 cool?
Also, one of these days AMD or Intel will bolt 8GB on their CPUs too, and then they’ll squash M.
I can’t remember who it is but somebody is already doing this. But it’s primarily marketed as an AI training chip. So basically only Microsoft and Google are able to buy them, even if you had the money, there isn’t any stock left.
I think this is one of those words that has lost its meaning in the personal computer world. What are people doing with computers these days? Every single technology reviewer is, well, a reviewer - a journalist. The heaviest workload that computer will ever see is Photoshop, and 98% of the time will be spent in word processing at 200 words per minute or on a web browser. A mid-level phone from 2016 can do pretty much all of that work without skipping a beat. That’s “professional” work these days.
The heavy loads Macs are benchmarked to lift are usually video processing. Which, don’t get me wrong, is compute intensive - but modern CPU designers have recognized that they can’t lift that load in general purpose registers, so all modern chips have secondary pipelines which are essentially embedded ASICs optimized for very specific tasks. Video codecs are now, effectively, hardcoded onto the chips. Phone chips running at <3W TDP are encoding 8K60 in realtime and the cheapest i series Intel x64 chips are transcoding a dozen 4K60 streams while the main CPU is idle 80% of the time.
Yes, I get a bit bent out of shape over the “professional” workload claims because I work in an engineering field. I run finite element models and, while sparse matrix solvers have gotten faster over the years, it’s still a CPU intensive process, and general (non-video) matrix operations aren’t really gaining much speed. Worse, I work in an industry with large, complex 2D files (PDFs with hundreds of 100MP images and overlain vector graphics), and rendering speed hasn’t appreciably changed in several years because there’s no pipeline optimization for it. People out there doing CFD and technical 3D modeling, as well as other compute-intensive tasks on what we used to call “workstations”, are the professional applications which need real computational speed - and they’re/we’re just getting modest speedups that scale with roughly the square root of the number of cores, when the software can even parallelize at all. All these manufacturers can miss me with the “professional” workloads of people surfing the web and doing word processing.
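For anyone curious what that kind of CPU-bound work looks like, here’s a toy sketch in SciPy of the sparse linear solves behind finite element analysis. The matrix (a 1D Poisson-style tridiagonal system) and the problem size are purely illustrative, not a real FEM model:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

n = 1000
# Tridiagonal "stiffness" matrix from a 1D Poisson problem -- the
# classic toy stand-in for a finite element system
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Direct sparse factorization and solve: serial, general-purpose CPU
# work that gets no help from the fixed-function video pipelines
x = spsolve(A, b)
residual = np.linalg.norm(A @ x - b)
```

Direct factorizations like this parallelize poorly, which is exactly why extra cores and media ASICs help far less here than in the video-encoding benchmarks.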
Indeed! It makes the benchmarks that much more disingenuous since pros will end up CPU crunching. I find video production tedious (it’s a skill issue/PEBKAC, really) so I usually just let the GPU (nvenc) do it to save time. ;-)
Yeah, I gave Apple a try over the last two years, largely because I was annoyed with Google and wanted to ditch Android. I’ve been fed up since about 6 months in, but gave it some more time, which led to an eventual waiting game to get the replacements I want.
I just picked up a Thinkpad P14s g4 AMD with a 7840U, 64GB of RAM, and a 3-year onsite warranty for $1270 after taxes. I added a 4TB Samsung 990 Pro for another $270. I can’t imagine spending more than that and only getting 8GB of RAM (and less warranty) - that’s as much as I have assigned to my GPU alone. Plus I get to run Linux; I really didn’t realize how much macOS would leave me wanting.
The thing I’ll miss is the iPhone 13 Mini size. I found iOS to be absolute trash, but there’s just not an Android phone that’s a reasonable size. But at least I can run Calyx/Graphene on a Pixel and get a decent OS without spying.
I do like the M1 MBA form factor, too, but I’ll grab the Thinkpad X13s successor for portability and get a better keyboard. I don’t need top end performance out of that, I really just want battery life and passive cooling.
And don’t even get me started on the overpriced mess that are the Airpods Max. I much prefer the Audeze Maxwell and Sennheiser Momentum 4 I replaced them with.
Thanks a lot for your post! The future of cars looks grim.
Serious and naive question: how could I get rid of the tracking at the hardware level when I will have no choice other than to buy a connected car?
Is there an antenna or a SIM card somewhere that I could disconnect/remove? Would the car continue to work if the connection to the manufacturer's server is lost?
For recent cars I am afraid you are right. My current and "old" car has a built in navigation system with the map on an SD-card. No need for a connection to a smartphone - which I do not own. Therefore I suppose it is not communicating with the manufacturer.
Then, someone in my family with a more recent car got several "firmware updates" out of the blue, hinting to a 'permanent' connection to the manufacturer.
I have the feeling we need to start organizing and claim a "right to disconnection". Having the car dial for help after a crash is one thing but what Mozilla's report describes is at another, much higher level.
Cars are built in modules, so there is definitely something you could disconnect to prevent it phoning home. You might need to take the dashboard apart though.
There is nothing preventing the car from starting and running without it. As long as you have a key fob it will attempt to start.
Thanks! Knowing that what I might be searching for would be somewhere under the dashboard is a good first step.
Now, I am not an engineer, nor do I have any experience in electronics, BUT I know from my dad that taking the dashboard apart is not an easy task. And if I did succeed, I don’t know what I would be looking for… Would an antenna look like a piece of wire? Or could it be embedded in the 'copper' circuitry of a PCB? Do cars use regular SIM cards like the ones found in phones, or do they look different?
The maintenance manual would probably be a good place to start before taking anything apart.
Every platform is different. The maintenance manual won’t tell you as it’s not part of maintenance. If you really want a piece of literature then you’ll need a factory service manual, but no offense if you don’t know what you’re doing you WILL cause damage to your vehicle (or even yourself if you accidentally mess with the airbags)
Valid point, no offense taken. I did not think about the airbags! As for damage to the vehicle, this is something I understand and am willing to accept. If I do stupid things I have to face the consequences.
Anyway, getting the help of a mechanic would be point number 1 on my list. If I can find one willing to take the challenge :)
I can't speak for the more modern cars, but my 2019 corolla had a cell phone connection which could be cut by pulling a single fuse. Idunno if it's a universal name, but it was called the DCM module. The emergency button in the roof was wired through it, and so was one of the right speakers and the built-in microphone. None of them work with the fuse removed. I'll route the speaker and mic wires around it at some point by going through the glove box, but it hasn't been a priority for me.
Yeah. I doubt they can have debates in person, either. But getting 7 people in a room so that the 2 highest paid ones can ideate all over each other while the other 5 nod along as a paid audience just feels better for those 2 than looking up to see the glassy-eyed stares of people who are trying to get their work done while sitting in on a pointless vanity meeting.
Wow, well said. I wonder if the argument that "working in the office is important so that younger/newer employees can receive mentorship" is just a theme and variation on the same thinking.
Not really. There is a lot of mentor activity that is easier in person because of how asking follow up questions works. In person it is easier to convey intent and ease worry for people doing new things just like how video conferencing can be easier than email depending on the topic. Or how mentoring is generally better than just giving someone a manual and not answering any questions for complex tasks.
That is not to say it is necessary most of the time, and the cries about everyone needing to be in the office because of mentoring don't make sense for anyone other than the mentor and the new staff. That tends to be projection by people who can only handle communication in person.
Maybe it depends on the specific field, but I've had no issues mentoring people remotely, and even when I was in the office I was doing it via Teams half the time.
In many contexts it isn't that hard if you have the tools. The fact that many workplaces skimp on the tools is a them issue, not a mentoring issue.
A lot of people in this thread seem to downplay the article with “yeah, that might be your opinion…” but two facts that are facts and not opinions are:
The market share Firefox holds is insignificant.
Mozilla’s business has a near-100% dependency on one “customer”, Google.
This means that if Google decides to stop bank rolling Mozilla it’s game over. Firstly because other revenue streams are currently near insignificant when you look at the total expenses.
Secondly, because Firefox holds no significant market share, no one else would be interested in investing in Mozilla and the future of Firefox. After all, whatever Mozilla throws up on the wall as the “grand master plan for world dominance” would just prompt the question “Why didn’t you do this before?”.
I’ve been using Firefox for almost 20 years. I started using it because I saw what happens when one company controls the browser market. That web browser did so much damage, and we only really got rid of it some years ago.
Chrome is a perfect example that history repeats itself and that people are fucking stupid. People are actually acting surprised and complaining about Google putting effort into making ad blocking impossible in Chrome.
So all in all, if Mozilla doesn’t find other revenue streams, Firefox is dead… It just doesn’t know it yet.
Now, everyone yapping about how Linux was an insignificant player and still made it to the top just sounds like enthusiasts who really don't know history and the harsh reality of doing business.
Linux was little more than a hobby project (business-wise) that essentially only Red Hat and Suse made real money from in the 90’s.
Arguably you could say that the turning point was when the CEO of IBM, Lou Gerstner, shocked the world by saying that IBM was going to pump in 1 billion dollars in Linux during 2001. Now, that doesn’t look like much today when just Red Hat has a yearly revenue of 3-4 billion, but that’s how insignificant Linux was at that time.
After that milestone Linux went for the jugular on Windows Server. For ordinary people it would still take almost 10 years before they would hold something Linux in their hands.
The rocket engine that accelerated Linux and proved that it was ready for end users was Google and Android in 2007. Linux’s growth over the last 20 years wasn’t mainly driven by enthusiasts; it was businesses pumping money into future opportunities.
What future opportunities can Mozilla sell to investors with the market share Firefox has today?
Arguably Google needs Firefox and co. to not lose Chrome in Europe due to anti-monopoly rulings. I think that is sadly the best thing Firefox has to offer investors.
I expect Google to keep Mozilla/Firefox on the lifeline indefinitely to avoid antitrust issues in the states and EU, so Mozilla/Firefox won’t go anywhere.
Still, this doesn’t mean anything, because I often need Chrome or Safari to access some websites.
In the end it is quite funny: Moving a lot of stuff to the web made Linux a more realistic desktop option, at the same time to access a lot of stuff on the web one needs to run a Blink browser.
IMHO the most annoying thing is that we could at least have some laws which mandate that every government service must be available to open source users and that all government-paid software must run on at least Linux. Thanks to lobbying and power, this will never happen.
Edit: To state it more clearly: Firefox is IMHO in bad shape and in a bad situation. Firefox won’t die, but at the same time right now I already need Chrome/Safari browsers, because Firefox support is broken on many sites. I see no way Firefox can gain significant market share, especially seeing what regular consumers tolerate from Microsoft/Edge and Google/Chrome.
One big problem, even if Google continues to pour money into Mozilla, is that more and more sites and systems drop support for Firefox. When I say “drop” I mean implement measures that make a service harder to use from Firefox. Even Google does this.
Yup. Mozilla really needs to diversify and find new revenue sources.
They’ve been trying, but it’s proving difficult to do while still refraining from hoovering up and selling everybody’s data. Nobody wants to pay.
To make matters worse, anytime Mozilla tries to make any money, people accuse them of selling out or say they should just focus on Firefox. Some of these people even say that Firefox needs to get rid of Google funding immediately to get rid of Google’s influence.
But that means the death of Firefox. I don’t really get what these people want.
The issue is biggest for web browsers, but I also feel like I see that issue for a whole lot of web industries. Journalism, for instance. Everyone wants everything for free, and so the “articles” you see are garbage half churned out from algorithms to optimize click rate, and blanketed with dozens of ads. To take another example, games, we have a market saturated with freemium games that encourage people to spend nothing (and then hundreds). Pirates would now claim it’s a moral responsibility to pirate, but if we end up in that world, only a slim minority of people would ever make a living out of it.
The general unwillingness/inability of consumers to pay for digital content definitely causes a lot of problems now. I personally attribute it to a generally low minimum wage, but it could be an issue going beyond that.
These people want to be rid of Google’s influence, which is why they chose Firefox over Chrome to begin with. But they don’t understand the position Mozilla is in…
Some of these people even say that Firefox needs to get rid of Google funding immediately to get rid of Google’s influence.
Which is an important factor, because Mozilla is currently being kept alive specifically to lose.
To be fair, those people (and lots of others too) watch millionaires and billionaires just up and throw money around every day. Under that premise, it “should be as easy” as convincing a random capitalist with a narcissist complex to fund Mozilla. The problem is that people’s memory on the internet tends not to be retrospective, so they don’t notice that if Mozilla did that, they’d be in just about the same position Reddit was in five years before 2023.
The rocket engine that accelerated Linux and proved that it was ready for end users was Google and Android in 2007.
N-no. Correct about IBM though.
It seems that what made Linux and FreeBSD relevant was the late 90s’ and early 00s’ Web. And FreeBSD then lost to Linux, not to Windows Server or Solaris.
Linux’s growth the last 20 years wasn’t mainly driven by enthusiasts, it was business pumping in money in future opportunities.
Only there are different kinds of businesses, and the balance between them is becoming worse.
Before IBM made that statement there were essentially no major software vendors that ported and supported their software on Linux.
Yes, one might argue that Linux-Apache-MySQL-PHP revolutionized things, but other than that a clear majority of things were run on solutions that put money in Microsoft’s pockets.
Feel free to name drop some major finance systems or similar enterprise systems you could run without Microsoft cashing in on the OS in some way between 1990-2005.
As I wrote before, it took us 20 years to get rid of IE and a lot of proprietary server-side junk Microsoft blessed us with. That’s not a coincidence. 99% of all companies were stuck with development tools from Microsoft.
It wasn’t until the hardware really really caught up with Java requirements that things really changed.
I’ve just found mentions of Linux support by Oracle before that, so there were things before IBM and that statement. Though on that page there’s no Linux link, but there are AIX, Solaris etc and an NT one.
Feel free to name drop some major finance systems or similar enterprise systems you could run without Microsoft cashing in on the OS in some way between 1990-2005.
Could you please, on the contrary, name some such systems strongly requiring Microsoft really? IIS and AD are not that.
I mean, OK, for the thick clients for administrators likely it’d be many things.
But everything IBM or commercial Unix-based, like, again, Oracle databases.
I was born in 1996, so I don’t really know what I’m talking about. Just seems a bit skewed.