I am starting a new project that is intended to be designed as a (#p2p) protocol eventually with implementations in multiple languages. I know #Python well, but I have been learning #Rust and think I'll need to write at least some of the perf-sensitive components in Rust. Do I prototype it in Python and then rewrite in Rust later, or try and power through and write it in Rust now? #RustLang
edit: added hashtags fwiw
A bit of an overview and then I'll get into some of the more specific arguments in a thread:
This piece is in three parts:
First I trace the mutation of the liberatory ambitions of the #SemanticWeb into #KnowledgeGraphs, an underappreciated component in the architecture of #SurveillanceCapitalism. This mutation plays out against the backdrop of the broader platform capture of the web, rendering us as consumer-users of information services rather than empowered people communicating over informational protocols.
I then show how this platform logic influences two contemporary public information infrastructure projects: the NIH's Biomedical Data Translator and the NSF's Open Knowledge Network. I argue that projects like these, while well intentioned, demonstrate the fundamental limitations of platformatized public infrastructure and create new capacities for harm by their enmeshment in and inevitable capture by information conglomerates. The dream of a seamless "knowledge graph of everything" is unlikely to deliver on the utopian promises made by techno-solutionists, but projects like these do create new opportunities for algorithmic oppression -- automated conversion therapy, predictive policing, abuse of bureaucracy in "smart cities," etc. Given the framing of corporate knowledge graphs, these projects are poised to create facilitating technologies (that the info conglomerates write about needing themselves) for a new kind of interoperable corporate data infrastructure, where a gradient of public to private information is traded between "open" and quasi-proprietary knowledge graphs to power derivative platforms and services.
When approaching "AI" from the perspective of the semantic web and knowledge graphs, it becomes apparent that the new generation of #LLMs are intended to serve as interfaces to knowledge graphs. These "augmented language models" are joint systems that combine a language model as a means of interacting with some underlying knowledge graph, integrated in multiple places in the computing ecosystem: e.g. mobile apps, assistants, search, and enterprise platforms. I concretize and extend prior criticism about the capacity for LLMs to concentrate power by capturing access to information in increasingly isolated platforms, and to expand surveillance by creating the demand for extended personalized data graphs across multiple systems, from home surveillance to your workplace, medical, and governmental data.
I pose Vulgar Linked Data as an alternative to the infrastructural pattern I call the Cloud Orthodoxy: rather than platforms operated by an informational priesthood, reorienting our public infrastructure efforts to support vernacular expression across heterogeneous #p2p mediums. This piece extends a prior work of mine, Decentralized Infrastructure for (Neuro)science, which has a more complete draft of what that might look like.
(I don't think you can pre-write threads on masto, so i'll post some thoughts as I write them under this) /1
This post is linked to the poster's website, https://jon-e.net/sfn23/ which embeds comments beneath this post. Comments below here serve as a discussion space for those who can't be at the conference, or otherwise want to mark up the work.
edit 23-11-17: Added alt text, references, brief descriptions of related projects.
Reading the Swarm whitepaper, and I know this has already been said to death, but i freaking hate how the blockchain mfs absconded with the popular conception of #p2p and got it all filthy with their libertarian ancap stuff. Bittorrent culture is like "it is good to seed way more than you download for no reason," and despite the trivial neoliberal objection that "if there is no reason then nobody will do it" it freaking WORKS.
then the blockchain people come along and are like "any imbalance in bandwidth between peers needs to be settled by MONEY and you have to RENT the ability to share anything." and it's just a preposterously bleak world. like, because they can't imagine organizing anything together with other people they have to imagine a storage infrastructure that needs an economy to keep it running.
I have no interest in permanent immutable storage of anything because that is impossible and undesirable. I want to make a digital space that exists for as long as it is needed, and sure some domains in that like research data and whatever might need to last longer than others, but the preservation of that is always a social phenomenon, and I would way rather have that come from a place of shared belief in something that is important rather than being yoked to yet another zero-sum system of wealth and debt.
A #p2p future for the web is the radical idea that it's bad to put all data in a single place owned by 3 companies and rented by a few hundred. The internet wasn't a mistake, the cloud was a mistake. Platforms were a mistake. A mistake where it's not only possible but routine for "everyone's health data" to get stolen. https://infosec.exchange/@patrickcmiller/112341111375581551
I need to share my health data with like 3 people that aren't me. Why on earth is that data in the same pile as literally everyone else's?
Here you will find my previous editorial activity, which, I have to warn you, is somewhat controversial, as I am unequivocally opposed to what is called ‘common-enemy’ politics, and favor broad alliances around constructive common goals, known as ‘common-humanity’ politics.
I wanna know what a new generation of #p2p looks like that's actively oppositional to the information economy. one that implements adversarial interoperability as a first class feature, that hops over API limits intended to keep you in walled gardens. one that makes a mockery of egress and storage fees. one where we don't plan to play nice, because the information conglomerates aren't either!
Have any #p2p people used #Keet ? Built on #Hypercore. Looking for a good hole punching implementation and I was told this was a good one. I just tried it out between my laptop (behind a university firewall/NAT) and cell phone (on a mobile network), and I had one issue connecting, but after an update was able to connect in both directions no problem. No login, no account, no phone number, and allegedly e2e without needing a public server to coordinate the holepunch. Need to check out how that's implemented. If this checks out I'm like damn, this might be better than Signal (which I love)
I really can't help but look at this whole blocklist drama with some amusement tinged with exasperation.
Anyone who has done even remotely any reading on #P2P systems and #federated systems has that flaw jump out at them the moment they look at the state of #ActivityPub implementations, and in general at how little importance the spec gives to #AsynchronousCommunication and nearly as little to P2P use.
Basically, what did you expect? Of course it'll devolve into petty fiefdoms.
ok #p2p frenz, how would I go about simulating complex networks to like test hole punching and other connection systems? I have the resources for some hardware if needed, but ya would love to be able to be very explicit about like "these are the network conditions that we can punch through and these are not"
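On Linux the usual route for this is `tc netem` plus network namespaces (or Mininet) to emulate latency, loss, and NAT boxes between virtual hosts. But before building that out, the punch-through logic itself can be sketched as a toy model. The sketch below is entirely mine, not any real tool's API: it uses RFC 4787's mapping terminology and a deliberately coarse success rule (it ignores filtering behavior, timeouts, and port-prediction tricks, all of which matter in practice):

```python
# Toy model of UDP hole-punch feasibility between two NATs.
# RFC 4787 terms: endpoint-independent mapping ("cone"-like NATs)
# reuses one external port for all destinations, so a rendezvous
# server can tell the other peer where to aim. Endpoint-dependent
# mapping ("symmetric" NATs) picks a fresh port per destination,
# so the advertised port is stale when the punch arrives.
# The rule below is a coarse simplification, mine, not a standard.

from enum import Enum

class Mapping(Enum):
    ENDPOINT_INDEPENDENT = "cone"
    ENDPOINT_DEPENDENT = "symmetric"

def can_hole_punch(a: Mapping, b: Mapping) -> bool:
    """Roughly: a direct UDP punch needs at least one side's
    external port to be predictable. Two symmetric NATs leave
    nothing to aim at without port guessing."""
    return not (a is Mapping.ENDPOINT_DEPENDENT
                and b is Mapping.ENDPOINT_DEPENDENT)

for a in Mapping:
    for b in Mapping:
        print(f"{a.value} <-> {b.value}: punch={can_hole_punch(a, b)}")
```

A model like this is mostly useful for enumerating the matrix of conditions you then want to reproduce for real under netem-shaped links: "these are the network conditions we can punch through" becomes a table you can test against.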
Are "hard links" important or a nice-to-have in a filesystem?
I'm trying to assess possible feature compromises for a #decentralized / #p2p based #filesystem that can be mounted on multiple machines, so any views on the usefulness of hard links, with examples, would be appreciated.
One such compromise is merging changes made from different devices, which becomes much harder with hard links.
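For what it's worth, the core headache can be shown in a few lines on a local POSIX filesystem: a hard link is a second directory entry for the same inode, so a write through either name is a write to the same file, and a sync layer has no canonical path to attribute the change to. A minimal local sketch (file names are mine, nothing p2p about it):

```python
# Two names, one inode: the property a multi-device p2p filesystem
# would have to reconcile, since a change "to b.txt" is also a
# change "to a.txt", with no primary name to hang the merge on.

import os
import tempfile

with tempfile.TemporaryDirectory() as d:
    a = os.path.join(d, "a.txt")
    b = os.path.join(d, "b.txt")

    with open(a, "w") as f:
        f.write("hello")

    os.link(a, b)  # b becomes a second directory entry for a's inode

    sa, sb = os.stat(a), os.stat(b)
    assert sa.st_ino == sb.st_ino    # same inode behind both names
    assert sa.st_nlink == 2          # link count reflects both entries

    with open(b, "a") as f:          # append through the *other* name
        f.write(", world")

    with open(a) as f:
        content = f.read()

print(content)  # -> hello, world: the write through b is visible via a
```

Symlinks, by contrast, are just paths-as-data and survive sync trivially, which is presumably why several network and sync-oriented filesystems support symlinks but not hard links.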
What if there was a truly #p2p network where running a node was possible on a mobile or any device, and doing so would earn you anonymous encrypted storage on that same network forever?
Secure, anonymous storage for anyone with an internet connection. No gatekeepers, just you and a global network of nodes running on almost any connected device.
A text-based mesh walkie-talkie with an e-ink display. Range in a city: 0.5-1 km, up to 7 km line of sight. Mesh relaying through participants is enabled experimentally. Functions like a chat. Operates on an unlicensed frequency. We're preparing for crowdfunding, but we're ready to ship to early adopters at $250 for a pair and $300 for three, in exchange for a detailed review. 43 units available so far.
Starting to think about a #p2p social media protocol that doesn't fall foul of the UK's ridiculously broad #OnlineSafetyAct, and leaves what you see in your own hands.
Feed curation as active or effortless as you like, using whatever approach you choose.
So a protocol that provides the basis for user respecting apps with different approaches.
Goal: curation with zero effort via support for 'algos' that serve you rather than the other way around.
The definition of #P2P is really important. People tend to stretch it to suit the way they want to market their project. This confuses developers.
If a protocol only runs on servers, it can still be p2p: the servers are each other's peers. But if those servers are used by a client, and that client doesn't have the same capabilities as the server, it's no longer P2P; now it's a distributed client-server system. Calling it p2p is intellectually dishonest.
This isn't debatable, it's computer science. There are no alternative facts.
Jami: A Versatile Cross-platform Open-Source Peer-to-Peer Decentralized Communication App
I can’t believe the last proper post I did about Jami was as far back as 2019. I mention it a lot, and it came up in a debate today on my Friendica site, but I realised it does deserve a proper feature post of its own.