Kid_Thunder

@Kid_Thunder@kbin.social
Kid_Thunder,

I prefer my passwords in the form of blank, holy water, heart and axe in a 4x4 grid, thanks.

It meets MFA, right? Something I am (heart), something I have (axe), something I know (don't pick up holy water) and the number of lives I have (blank). 16 characters for complexity.

Perfection.

Kid_Thunder,

GNOME Boxes is pretty easy to use and of course uses qemu + KVM. This would be a type 1 hypervisor vs. VirtualBox's type 2. It is point-and-click like VirtualBox. You don't need to use GNOME's DE to use Boxes.
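
For what it's worth, here's a quick sanity check that KVM acceleration is actually available on the host before blaming Boxes or qemu; just a sketch (Node/TypeScript), nothing Boxes-specific:

```ts
// Checks whether /dev/kvm exists and is usable; if it isn't, qemu falls
// back to pure software emulation and any VM will crawl.
import { accessSync, constants } from 'node:fs';

try {
  accessSync('/dev/kvm', constants.R_OK | constants.W_OK);
  console.log('KVM available: Boxes/qemu can use hardware acceleration.');
} catch {
  console.log('No usable /dev/kvm: expect slow, software-emulated VMs.');
}
```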

I have seen people post about your specific error for years when using the VirtualBox website's repository instead of their own distro's repository (if it exists).

Kid_Thunder,

In Boxes, power down your XP VM, click Settings -> Sharing Panel -> Enable Sharing toggle. Click File Sharing and enable File Sharing. Power on the VM.

At that point you should be able to drag and drop from your host direct into your VM for a file transfer.

You can also click the vertical dots menu in the Guest's console "screen" and click the Send File... menu option.

In the same menu you can click Devices & Shares -> Realtek USB or whatever -> Local Folder -> Select from the dropdown for the Host's folder that you'd like to share -> Save -> Make sure Toggle on the right is on.

Then your folder will, I believe, show up in XP as a removable drive, like a USB drive would.

Kid_Thunder,

I have an app that alters the colours of my TV at night. I can’t imagine that’ll be possible in a web based OS.

It should be able to, no problem. It isn't that it is web-based; it's that, essentially, a developer can use JavaScript in the React Native framework, which exposes the OS. In other words, developers can use JavaScript to do things native apps can do. This isn't new at all and has been refined for years.
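
As a minimal sketch of what that looks like in practice (the "DisplayColor" module and its method here are hypothetical stand-ins for whatever the OS actually registers on the native side):

```ts
// NativeModules is React Native's standard bridge from JS to native code.
// "DisplayColor" and setColorTemperature() are hypothetical, illustrating
// the pattern rather than a real API on any shipping TV OS.
import { NativeModules } from 'react-native';

const { DisplayColor } = NativeModules;

export async function enableNightTint(): Promise<void> {
  // JS asking the native layer to warm the panel's colour temperature --
  // something a sandboxed web page could not do.
  await DisplayColor.setColorTemperature(3400); // kelvin; illustrative value
}
```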

These JS frameworks do have a bad reputation for things like Electron (performance concerns) and node.js (security concerns).

I personally avoid software based on these technologies but then again, I avoid Amazon devices too and some people love them.

Kid_Thunder,

I didn't know that ansible-galaxy had a comic.

Kid_Thunder,

I actually posted that in science_memes a few days ago, including other solutions as well as hardware passthru. People kept replying that it wasn't a solution because, alternatively, the lab doesn't have the expertise; because somehow, after two decades, the only option available is to keep fighting a losing battle of maintenance with hardware that's no longer made; and because source code availability would somehow just magically be sustained by magic software developers still interested in it after all this time.
 
There's more goal post moving and some stretching assumptions in the responses but that's the ultimate gist.

It isn't that I'm against code rights dying with a vendor or even source code availability, but I was merely posting that these types of problems are too common and solvable already outside of severe edge cases.

Kid_Thunder,

Back when I was a hardware engineer (embedded hardware, not really part of IT) for avionics, most of what I'd see, where the interfaces weren't standard inside 'black boxes', were really just PCs on a motherboard with a 'bus controller' (not really a bus controller) that could be slotted into a PCI slot. You just have to pass the PCI device from the hypervisor to the VM where the drivers and the OS that uses them sit.
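
As a rough sketch of that passthrough (assuming the card shows up at host PCI address 01:00.0 and is already bound to the vfio-pci driver; the disk image name is made up):

```ts
// Launches a qemu/KVM guest and hands it the physical PCI card via VFIO.
// Node's child_process is just a convenient way to show the invocation.
import { spawn } from 'node:child_process';

spawn('qemu-system-x86_64', [
  '-enable-kvm',
  '-m', '2048',
  '-hda', 'guest.img',                // guest with the vendor OS + drivers
  '-device', 'vfio-pci,host=01:00.0', // the 'bus controller' card itself
], { stdio: 'inherit' });
```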

An issue that hangs some people up, for hardware that required an RTOS and was being virtualized, is the CPU scheduler (due to vCores/HT/SMT), but those didn't run on Windows of course. My solution is just pinning the VM to physical cores, every odd core (if I can't just turn HT/SMT off), for the VMs with an RTOS. Works great.
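
A blunt sketch of that pinning, using taskset to confine the qemu process (and its vCPU threads) to odd-numbered cores; the core list and image name are illustrative:

```ts
// taskset restricts qemu to specific physical cores, keeping SMT siblings
// idle so the RTOS guest sees more predictable scheduling latency.
import { spawn } from 'node:child_process';

spawn('taskset', [
  '-c', '1,3,5,7',      // odd cores only
  'qemu-system-x86_64',
  '-enable-kvm',
  '-m', '1024',
  '-hda', 'rtos.img',   // the guest that expects RTOS-like timing
], { stdio: 'inherit' });
```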

Most data connections are just serial types, and the Data|TX -/+ or TX|RX are simply swapped in the pin-out with a 'proprietary' form factor that's easy to pigtail into whatever.

Maybe I should just go into business modernizing old lab and factory equipment's compute.

Kid_Thunder,

Americans (the movement specifically gained ground with Noah Webster, of Merriam-Webster Dictionary fame; it was originally titled An American Dictionary of the English Language) changed a lot of words to standardize and simplify their spelling while keeping it phonetically similar. Keep in mind that many Americans at the time spoke many different languages in different enclaves, and that this took place soon after the US gained independence from Britain. Notably, Webster learned 28 languages to study etymology in order to facilitate this standardization.

For example, 'k' was dropped from the end of words like musick and publick, a change the British public(k) had already commonly adopted at the time anyway. Another example was dropping the extra 'l' (L) in words like travelling.

Then again, cheque became check and not chec. The British also use 'check' from the word 'eschequier' in the context of Chess, which is also where 'cheque' in paycheque comes from. A check against the king. A cheque against forgery. So why not 'chec'? Because 'check' was also commonly interchanged by everyone the world over anyway for checque and chèque in business before the United States existed. In business between many peoples, why add another word that may be confusing when 'check' is close enough to what Webster and others were trying to accomplish?

Looking at it through the lens of the time and the context of the US populace, it seems logical, as many of the changes were readily accepted by the diverse population of the US. It may not seem so now when considered merely from today's perspective.

Kid_Thunder,

or even basic product management.

Google Wallet (2011) became Android Pay (2015) became Google Pay (2018) became Google Wallet (2022), except in some places. Also, except in the US (and maybe elsewhere?) where Google Pay is still around but just to send money between people.

Google Talk (2005) and Google+ Messenger (2011) sort of became Google Hangouts (2013), which was part of Google+ (2011) and then became standalone Hangouts (2013), which became both Duo (2016) and Allo (2016); then, while Duo and Allo were both still around, Hangouts became Hangouts Meet (2019) and Hangouts Chat (2019), which became Google Meet (2017 -- Yes, Hangouts Meet was still around) and Google Chat (2017 -- Yes, Hangouts Chat was still around). Google Allo died in 2018 and Duo died in 2022.

Inbox (2015) became a better gmail Android app than gmail actually was. Inbox was discontinued in 2019 with the advertisement that gmail had integrated Inbox's features (it didn't add most of them). This spawned other 3rd-party gmail-handling apps to take its place.

Google Play Music's (2011) podcast feature split into Google Podcasts (2018), which stopped having releases in 2021 and rolled up/is rolling up into YouTube Music (2015). Google Play Music itself became YouTube Music in 2020.

Right now there are even Android Auto and Android Automotive simultaneously doing pretty much the same thing, yet they are not the same. Android Automotive itself exists as Android Automotive with Google Automotive Services and also as Android Automotive without Google Automotive Services.

Android Auto for Phone Screens was replaced with Google Assistant's driving mode.
 
There are many, many, many more crazy branding issues but I just don't feel like continuing. Google has also killed at least 54 hardware lines, 59 apps and 210 services.

Kid_Thunder,

Secondly, you’ve combined app categories that don’t fit. Google+ was a social network, Hangouts was a chat app

Hangouts was originally part of Google+, hence "Google Hangouts (2013), which was part of Google+ (2011)"

If you don't recall it as a feature within G+, then at least trust an article talking about it.

Hangouts was third, a real-time video chat product embedded in Google+.

The Verge (2013) EXCLUSIVE: INSIDE HANGOUTS, GOOGLE'S BIG FIX FOR ITS MESSAGING MESS

Finally, you’ve conflated technologies. Android Automotive OS is an entire OS running in a car that is maintained by the OEM in much the same way as Android is on phones.
...Incidentally, this has nothing to do with Android Auto, which is an extended display for your phone.

I mention both as they are intended to provide the same functionality, regardless of the underlying technology -- integration of a vehicle's infotainment with a Google-provided ecosystem. In fact, Android Auto apps are compatible with Android Automotive because, technical 'why' aside, the function to the end user is the same.

Google has been around for 25 years and always has chased innovation. They create a ton of things, see what sticks, then iterate or pivot.

According to many Googlers over the years, the reason many of these projects eventually discontinue and fail isn't because things 'aren't sticking' but rather due to the internal culture, in that to set yourself apart and get good performance ratings, you must always strive to be on teams that are doing something new. This leaves little to no resources for maintaining the 'old' regardless of how much people like them (or not).

While I too have been frustrated by the discontinuation of service I liked

I don't know about everyone else, but I wrote what I wrote, not because I'm frustrated about a discontinuation of any service I liked from Google. That happens. It is because the branding and evolution of products are confusing and sometimes, they even coexist. From my perspective, it often seems as if there is no actual long-term plan or guidance for many services that have come and gone with no signs of that changing.

The perception of the chaotic mess that Google brings with many of its services past, present and probably the future is at least something that I felt I wanted to criticize. They deserve it regardless of the supposed intentions behind the curtain.

Whenever I hear this kind of complaint, it sounds to me that people just want Google to be more like Apple or Microsoft and churn out minor improvements to their existing money makers with minimal innovation.

That's your opinion I suppose but it is not mine. My opinion is that Google should at least change the perception of their products to have clear and clean plans as they evolve. This would give me a reason to trust their branding more.

You mentioned Duo and Allo, which co-existed along with Hangouts for a time. The lack of interoperability created a confusing schism within the same userbase at the time. You could argue that somehow they 'innovated' chat and video conferencing, but they didn't even call them something like Hangouts Chat and Hangouts Video when they segregated the functions, with a clear handover from Hangouts itself.

I think people would just prefer Google appears to be less arbitrary and in disarray about their products. If we are to believe some of the people that actually worked on these products, then that is going to require a culture change within.

Kid_Thunder,

Alright, I know this is going to get some hate, and I fully support emulation and an overhaul of US copyright and patent law, but justmeremember's supportive post is just bad. This is the same bad practice that many organizations, especially in manufacturing, have problems with. If the ~20 years of raw data is so important, then why is it sitting on stuff that's decades past end-of-life?

If it is worth the investment, then why not invest in a way to convert the data into something less dependent on EOL software? There are lots of ways, cheap and not, to do this.

But even worse, I bet there's 'raw' data that's only a year old still sitting on those machines. I don't know if the 'lab guy' actually pulls a salary or not, but maybe hire someone to begin actually solving the problem instead of maintaining an eventually losing game?

In ~20 years they couldn't cut slivers from the budget to eventually invest in something that would perhaps 'reset the clock'?

At this point I wouldn't be surprised to find a post of them complaining about Excel being too slow and unstable because they've been using it as a database for ~20 years worth of data at this point either.

Kid_Thunder,

Because it's often not worth the investment. You would pay a shit ton for a one time conversion of data that is still accessible.

Still accessible for now, less likely to be accessible as the clock ticks, and less likely still that there'll be compatible hardware to replace what fails.

If it isn't worth the investment, then what's the problem here? So what if the data is lost? It obviously isn't worth it.

If the software became open source, because the company abandoned it, then that cost could potentially be brought down significantly.

OK, but that isn't a counterpoint to what I said. If the hardware never fails, there is no problem either. What does that matter? And who cares if it was FOSS (though I am a FOSS advocate)? What if nobody maintains it?

It doesn't matter because these aren't the reality of the problems that this person is dealing with. Why not make some FOSS that takes care of the issue and runs on something that isn't on borrowed time and can endure not only hardware changes but operating system changes? That'd be relevant. It goes back to my point, doesn't it? Why not hire this person?

Clean-room reverse engineering has case law precedent that essentially makes this legally low-risk (certainly nil if the rights holder is defunct).

You are also missing the parts where functional hardware loses support. Which is even worse in my opinion.

I didn't miss the point. I even made the point of having at least ~20 years to plan for it in the budget. Also, the hardware has already lost support or there wouldn't be an issue, would there? Otherwise you could just keep sustaining it without relying on a diminishing supply.

Or are we talking about some hypothetical hardware that wasn't mentioned? I guess I would have missed that point since it was never made.

Kid_Thunder,

I didn't say capitalism is perfect nor did I imply it.

So hypothetically let's say the vendor lost the rights to the software since it is abandonware -- great. I'd love it.

What changes for justmeremember's situation? Nothing changes.

I suppose your only issue here is that the software vendor or some entity should support it forever. OK, so why didn't they just choose a FOSS alternative or make one themselves? If not then, why not now? There is nothing that stops them from the latter other than time and effort. Even better, everyone else could benefit!

Does that make justmeremember just as culpable here or are they still the victim with no reasonable way to a solution?

I posted simply because this specific issue is much too common, and just as common is the failure to actually solve it, abandonware argument aside, instead of stop-gapping and kicking it down the line until access to the data is gone forever.

Kid_Thunder,

Well, I think a better solution would be to deliver all source code with the compiled software as well. I suppose that would extend to the operating system itself and the hope that there'd be enough motivation for skillful folks to maintain that OS and support for new hardware. Great, that would indeed solve the problem and is a potential outcome if digital rights are overhauled. This is something I fully support.

What is stopping them now from solving access to this data, even if it's in a proprietary format?

Really, again, I don't take issue with the abandonware argument but rather with the situation I posted about itself. Source code availability and the rights surrounding it are only one part of the larger problem in the post.

Source code and the rights to it aren't the root cause of the problem in the post I was addressing. It could facilitate a solution, sure, but given that there are at least ~20 years of data at risk currently, there were also ~20 years of potential labor hours to solve it. Yet, instead, they chose to 'solve' it in a terrible way. That is what I take issue with.

Kid_Thunder,

It isn't necessarily a computer programming problem either. Rather, it is an IT problem, at least in part, and one that the poster states is the primary job of his 'lab guy' -- specifically, to maintain two ancient Windows 95 computers. That person must know enough to sustain the troubleshooting and replacement of the hardware, and certainly at least the transfer of data from the old spinning hard drives. Why not instead put that technical expertise into actually solving the problem long-term? Why not just run both in qemu and use hardware passthru if required? At least then you would rid yourself of the ticking time bomb of hardware with diminishing availability. That RAM that is no longer made isn't going to last forever. They don't even need to know much about how it all works. There are guides available, even for Windows 95.
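
As a minimal sketch of the qemu route, assuming the old drive was imaged with dd to win95.img (names, sizes and devices here are illustrative; the guides cover the fiddly details):

```ts
// Boots a Windows 95 guest from a raw image of the original disk.
import { spawn } from 'node:child_process';

spawn('qemu-system-i386', [
  '-m', '256',               // a Win95-era RAM ceiling
  '-cpu', 'pentium',
  '-hda', 'win95.img',       // dd image of the original spinning drive
  '-vga', 'cirrus',          // a card Windows 95 ships drivers for
  '-serial', '/dev/ttyUSB0', // hand a cheap USB serial adapter to the guest
], { stdio: 'inherit' });
```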

Perhaps there are other hurdles, such as something running on ISA, but even so, eventually it isn't going to matter. Primarily, though, it seems the hurdle is specifically the software and the data it facilitates. Does it really have some sort of ancient hardware dependency? Maybe. But in all that time, this 'lab guy' whose main role is just these two machines must have had some time to experiment and figure this out. The data must be copyable, even as a straight hard drive image if it isn't a flat file (extremely doubtful, but it doesn't matter). I mean, the data is, by the author's own emphasis, CRITICAL.

If it is CRITICAL then why don't they give it that priority, even to the lone 'lab guy' that's acting IT?

Unless there's some big edge case here that just isn't being said, something above and beyond simply the software they speak about, I feel like I've put more effort into typing these responses than it would take to effectively solve the hardware-on-life-support side of it. Solving the software-dependency side? Depending on how the datasets are logically stored, it may require a software developer, but it also may not. However, simply virtualizing the environment would solve many, if not all, of these problems with minimal investment, especially for CRITICAL (their emphasis) data with ~20 years to figure it out. It would take just a new computer, some sort of media to install Linux or *BSD from, and perhaps a COTS converter if it is using something like an LPT interface or even a DB9/DE-9 D-Sub (though you can still find modern motherboards, cards, and even laptops capable of supporting those, and certainly a cheap USB adapter as well).

Anyway, I'm just going to leave it at that. I think I've said a lot on the subject to numerous people and do not have much more to add, other than that this is most likely solvable and, outside of severe edge cases, solvable without expert knowledge considering the timeframe.

Kid_Thunder, (edited)

So again and again and again, I was not arguing against the abandonware issue. I take issue with how the problem is being stop-gapped in this current situation and not in some hypothetical alternate timeline.

Instruments like the ones we use are super expensive

Great. I didn't imply otherwise.

On top of that most people here barely understand computer and software

So the lab guy maintaining Windows 95-era computer hardware barely understands computers. Got it. I suppose this same lab guy won't be able to do anything even if the source code were available and would still be doing the same job.

What you’re suggesting is treating the symptoms but not the disease. Making certain file formats compatible with other programs is not an easy undertaking and certainly not for people without IT experience.

I didn't say it isn't. I said they've had ~20 years to figure it out. What would source code being available solve for them then? We could assume other people would come together to maintain it, sure. I've also talked about other solutions in replies. There are even more solutions. I wasn't trying to cover all bases there. It is just that, in the couple of decades this has been a problem, there has been plenty of time to solve it.

Software for tools this expensive should either be open source from the get-go or immediately open-sourced as soon as it’s abandoned or company goes bust

Oh OK, so that makes it less complicated. I thought the assumption here is that, in general, anyone in that lab barely understands a computer or how software works. So, who's going to maintain it? Hopefully, others, sure. I actually do talk about this in other replies and how it is something I support and that, in this case, the solution is to deliver the source with the product. FOSS is fantastic. Why can't that just be done now by these same interested parties? Or are we back to "can't computer" again? Then what good is the source code anyway?

But again, that's a "what-if things were different" which isn't what I was discussing. I was discussing this specific, real and fairly common issue of attempting to maintain EOL/EOSL hardware. It is a losing game and eventually, it just isn't going to work anymore.

Even with plenty of funding to workaround the issue that shouldn’t be necessary, it’s a waste of time and money just so a greedy company can make a few extra bucks.

Alright, the source code is available for this person. Let's just say that. What now?

What can be done right now is fairly straightforward, and there are numerous step-by-step guides. That's to virtualize the environment. There is also the option of hardware passthru, if there is some unmentioned piece of equipment. This could be done with some old laptop or computer like the one you probably tossed in the dumpster 10 years ago. The cost is likely just some labor. Perhaps that same lab guy can poke around, or if they're at a university, have their department reach out to the Computer Science or another IT-related teaching department and ask if there are any volunteers, even undergrads. There are very likely students that would want to take it on just because they want to figure it out, and nothing else.

There may be an edge case where it won't work due to some embedded proprietary hardware, but that's yet another hypothetical issue at stake, one whose answer is open-source hardware. That's great. Who's going to make that work on a modern motherboard? The person you've supposed can't do that because they barely understand a computer at all?

In this current reality, for the specific part of the post I am addressing, the solution of sustaining something ancient with a diminishing supply is definitely not the answer. That is the point I was making. There is a potential of ~20 years of labor hours. There is a potential of ~20 years of portioning of budgets. And let's not forget, according to them, it is "CRITICAL" to their operations. Yet it is maintained by a "lab guy" who may or may not have anything more than a basic understanding of computers, using hardware that's no longer made, hoping to cannibalize it, buy second-hand, and find parts in bins somewhere.

If this "lab guy" isn't up to the task, then why are they entrusted with something so critical with nothing done about it in approximately two decades? If they are up to the task, then why isn't a solution with longevity and real risk mitigation being taken on? It is a short-sighted mentality to just kick it down the road over and over again plainly hoping something critical is never lost.

Kid_Thunder,

Cause the instrument is important and replacing it, aside from being a massive waste of a perfectly functioning instrument, costs hundreds of thousands if not millions of € that we can’t spend

Why would you need to replace the instrument? You only need to replace the computers' functions. Why does it need to cost anything other than some other old workstation tossed into an ewaste bin years ago?

some dude on Lemmy said we shouldn’t use stop-gap measures for a problem that’s completely artificial.

As opposed to some dude on Lemmy bemoaning that this just can't be solved without source code, even though I've given actual solutions available now and for little to no material cost?

You have admitted that you'd still have to rely on someone else's expertise and motivation in the hopes that they'd solve the problem for the lab, yet, in my opinion, you're discarding the solutions I've presented as if they aren't solutions at all because, at least in one of your points, they'd rely on someone else's expertise and motivation in the hopes that they'd solve the problem for the lab. Even then, as I said, they've had decades to figure it out, and there already exist freely available step-by-step instructions to help them solve the problem, or get them almost to the end assuming there is some proprietary hardware never mentioned.

Anyway, I don't really have anything else to add to the conversation. So you can have the last word, if you wish.

Kid_Thunder,

Don't forget that there are many, many appointed superdelegates who each have around 8,000 voting power.

There were 618 pledges from DNC superdelegates in the 2016 nomination, equaling 4,944,000 voting power (meaning votes equivalent to ~5 million regular voters in the DNC). These are not delegates assigned to states but to specific groups and people in positions in the DNC itself.

For reference, 16,917,853 popular votes went to Hillary Clinton and 13,210,550 went to Bernie Sanders, according to this eye cancer of a website. If all of the DNC superdelegates had voted for Bernie Sanders, he would have won the 2016 DNC primaries even though the actual regular DNC voters voted for Hillary.
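
To put numbers on that: 618 superdelegates × ~8,000 voting power = 4,944,000. Sanders' 13,210,550 popular votes plus that bloc comes to 18,154,550, which tops Clinton's 16,917,853.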

Anyway, I'm only making the point that the system was broken.

The DNC did reform this afterwards: superdelegates now cast votes only if the first ballot fails to produce an absolute majority; otherwise, they cannot vote (as superdelegates).

Fox News Left Shell Shocked by Dems’ Election Night Romp (www.thedailybeast.com)

A year after promising viewers a “red tsunami” in the 2022 midterms, only to be left with egg on their faces after the GOP drastically underperformed, Fox News was once again wondering what went wrong after Democrats romped to victory in statewide elections on Tuesday night....

Kid_Thunder,

“Democrats are trying to scare women into thinking Republicans don’t want abortion legal under any circumstances.”

  • Sean Hannity

I think the GOP did that themselves last year with the 10-year-old girl who had to cross state lines into Indiana to get an abortion.

Kid_Thunder,

I mean, I could post the quotes of him supporting a no-exception national abortion ban and the quote of him saying that if he really paid for an abortion, there's no shame in that. But that's low-hanging fruit. Instead, I'm just going for the fruit that already fell on the ground:

I’m this country boy. I’m not that smart.

  • Herschel Walker

And people say, ‘Herschel, you played football.’ But I said, ‘Guys, I also was valedictorian of my class. I also was in the top 1% of my graduating class in college.

  • Also Herschel Walker

So what we do is we’re going to put, from the ‘Green New Deal,’ millions or billions of dollars cleaning our good air up. So all of a sudden China and India ain’t putting nothing in there – cleaning that situation up. So all with that bad air, it’s still there. But since we don’t control the air, our good air decide to float over to China, bad air. So when China gets our good air, their bad air got to move. So it moves over to our good air space. And now we’ve got to clean that back up.

  • Herschel Walker again. This isn't a joke.

Kid_Thunder,

If there's an adblock wall I can't get around because, for example, I'm using a DNS server that blocks ads on my phone, then I just create a snapshot in the Wayback Machine. For whatever reason, websites don't block it, or it's just really good at circumventing them.

http://web.archive.org/web/20231107062355/https://www.haaretz.com/israel-news/2023-11-06/ty-article/german-journalists-detained-by-israeli-soldiers-asked-if-we-were-jewish-at-gunpoint/0000018b-a5ae-d9c0-a5fb-edfee9ed0000
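
If you'd rather script it, the Wayback Machine has a public Save Page Now endpoint; a rough sketch (the target URL is illustrative):

```ts
// A plain GET to web.archive.org/save/<url> asks the Wayback Machine to
// capture a fresh snapshot of the page.
const target = 'https://example.com/some-article'; // illustrative URL

const res = await fetch(`https://web.archive.org/save/${target}`);
// The snapshot's address usually comes back in the Content-Location
// header; otherwise fall back to the final redirect URL.
console.log(res.headers.get('content-location') ?? res.url);
```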

Kid_Thunder,

It would be hard for the current Supreme Court to actually rule in protection of abortion rights, since they left it up to the states. Interestingly, Alito wrote with a slant that was very pro-states'-rights to ban abortion specifically, but the opinion also heavily implies, to the point of being just shy of explicit, that the opposite is allowed as well; that must be what they meant, or it doesn't make actual sense.

It would take a lot of logical gymnastics to essentially unwind and rewrite an opinion that doesn't go against their own majority opinion. That said, they did perform some Olympian gymnastics on not only Roe v. Wade but also Planned Parenthood v. Casey, or in some instances just outright said those were plainly wrong.

They would essentially have to all but support a fundamentalist christo-fascist government (probably under the guise of what is in the best interest of the people, even against their own will) over even the Constitution itself, specifically the 10th Amendment, and run a serious risk of impeachment. Alito could opine that it is Congress' business to supersede that (Article VI), but that would run counter to his written opinion in Dobbs v. Jackson Women's Health Organization (that it is the states' prerogative to regulate abortion and not the federal government's), unless he was specific that he meant it all narrowed to the 14th Amendment, and it would further run counter to his own weaker-federal-government stance.

It would be far more likely for the SC to find that a state and its people have the right to regulate abortion as they see fit if they were even to decide to hear such a case.

TL;DR: it'd be extremely risky and difficult to essentially give the states the right to regulate abortion but then take it away unless those laws only ban it.

Kid_Thunder,

Is there a perceptible profit motive? No? Then we're wasting resources that could be used to chase things that have shareholder value.

  • Corporate "Ethics"

Kid_Thunder,

  • 20.58% of the US population are baby boomers as of 2022. Source March 2023.
  • Alternatively, 17% of the US population is over 65. Source July 2022.
  • Those aged 50+ are about evenly split between Republican-leaning and Democrat-leaning, which was surprising to me. Source.
  • 38.6% of the US population are southerners as of 2022. Source.
  • Obviously, the above are not additive, but other than Georgia, southern states went red in the last Presidential election in 2020. Source.

Republican-leaning and Democrat-leaning are about the same at 44% vs. 45% respectively. Interestingly, both parties are the same at 28% of the sampling, with 41% identifying as Independent as of 2022. Source.

The source for thehill's article is from a CBS News poll which may or may not have an even demographic sampling of the US population. However, the above stats do suggest that it is probably accurate enough.

What's really important is what percentage of these samplings actually bother to vote. Only 49.1% of 18 - 24 year olds in the US are registered to vote, 62.7% of 25 - 34 year olds...increasing with age groups until 75 or older, where there is a slight drop to 76.6%. Source 2022. However, there was only a 62.8% turnout rate in 2022, and that was considered a 'surge'. Source 2022.

I really don't want to deep-dive into the available statistics to start figuring out analytics and predictions. I don't do this for a living and I am willing to devote no more than 30 minutes to all of this. And unfortunately, the statistics really didn't show what I expected, which was probably going to be that we can just blame the Boomers. They show a pretty even split amongst the population. Though the trend seems to be that the older you are, the more likely you are to be registered to vote, but not necessarily to actually vote, in the US.
