hrefna,
@hrefna@hachyderm.io avatar

The gist of what I am going for:

What is a minimal subset that would allow us to treat the @context object as a set of extensions, rather than a set of processing directives, while still remaining useful?

Basically, you could have a "mastodon postprocessor" with all of the logic there.

I have a bunch of thoughts and have validated some, but I'm interested in the experience of those who have banged their head against this problem. Any ideas? Does that make sense as a goal?

gugurumbe,

@hrefna In my frustration with jsonld, I started to write this: https://labo.planete-kraus.eu/neoas.git/tree/README

However, this is not very useful. I’m now trying to get the json-ld algorithms to work, with the idea that if I see a context I don’t know, I’ll just pretend it’s { "@context": {}}.
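The fallback described here can be sketched in plain Python (the set of known contexts is hypothetical; a real implementation would vendor the actual context documents):

```python
# Contexts vendored locally (hypothetical set); any remote context we don't
# recognize is neutralized, so the JSON-LD algorithms can run without fetching.
KNOWN_CONTEXTS = {
    "https://www.w3.org/ns/activitystreams",
    "https://w3id.org/security/v1",
}

def neutralize_unknown_contexts(doc):
    """Replace unrecognized remote context references with an empty context."""
    ctx = doc.get("@context", [])
    if not isinstance(ctx, list):
        ctx = [ctx]
    cleaned = [
        {} if isinstance(entry, str) and entry not in KNOWN_CONTEXTS else entry
        for entry in ctx
    ]
    return {**doc, "@context": cleaned}
```

The object is otherwise left untouched, so downstream processing sees the same properties, just without the unknown context's term definitions.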

gugurumbe,

@hrefna Fetching contexts on the web at run-time is in my opinion a Very Bad Idea: what if the context document changes? What if it becomes malicious? What if it becomes private and applications start to inject obfuscated or encrypted data?

If you develop a new ontology, either it is mature and used, in which case I will include it in my json-ld library, or in a plug-in if it is very specific, or it is not, and you still have a choice to use it with a namespace instead of a context.

jenniferplusplus,
@jenniferplusplus@hachyderm.io avatar

@hrefna I think that's generally the right idea. In an ideal world, an LD context would facilitate code generation. But for all of its links, it doesn't actually describe anything with enough structure for that to be possible, even in the most naive case.

For myself, I'm thinking of it as being advisory, at best. If I could query the @context to check for active extensions, that would be by far the most help it's ever been.
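The "advisory" reading amounts to scanning @context against a registry of extensions you know about, rather than executing it. A minimal sketch (the registry IRIs and names are illustrative, not standardized):

```python
# Hypothetical registry mapping context IRIs to extension names.
EXTENSION_REGISTRY = {
    "https://w3id.org/security/v1": "security-v1",
    "http://joinmastodon.org/ns": "mastodon",
}

def active_extensions(doc):
    """Treat @context purely as an advisory list of declared extensions."""
    ctx = doc.get("@context", [])
    if not isinstance(ctx, list):
        ctx = [ctx]
    found = set()
    for entry in ctx:
        if isinstance(entry, str) and entry in EXTENSION_REGISTRY:
            found.add(EXTENSION_REGISTRY[entry])
        elif isinstance(entry, dict):
            # Inline term maps often alias a namespace,
            # e.g. {"toot": "http://joinmastodon.org/ns#"}
            for value in entry.values():
                if isinstance(value, str) and value.rstrip("#") in EXTENSION_REGISTRY:
                    found.add(EXTENSION_REGISTRY[value.rstrip("#")])
    return found
```

Note this deliberately ignores everything a conforming JSON-LD processor would do with the context; it only answers "which extensions does this object claim to use?"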

hrefna,
@hrefna@hachyderm.io avatar

@jenniferplusplus I have tried six different ways (not actually, more like 3) to get some form of code generation or metatemplate or something to work because it looks vaguely viable but every single time it runs into the exact problem that you are describing (along with that contexts themselves are a nightmare to process -.-)

I basically concluded that if I'm going to go down that road I need a grant sufficient to quit my job, if it is even possible to do in the first place.

jenniferplusplus,
@jenniferplusplus@hachyderm.io avatar

@hrefna I suspect that it's not possible in real usage. Maybe if every vocabulary was spectacularly well made, but they're not. The AS vocabulary doc is basically useless. It does nothing more than list the terms, without so much as defining or relating them to anything. And that's before we get to vocabularies that don't even exist. Mastodon's being a notable example.

KevinMarks,
@KevinMarks@xoxo.zone avatar

@jenniferplusplus @hrefna Paul Frazee (of BlueSky fame) came up with an alternative proposal designed for this use case as JSON-LZ a while back https://github.com/pfrazee/json-lz

hrefna,
@hrefna@hachyderm.io avatar

@KevinMarks

Yep, that project fits in a different niche than what I'm looking at: it's a different syntax entirely. A replacement for JSON-LD rather than something that interoperates with it as it is used in practice.

Another example is the largely defunct LitePub: https://litepub.social It was an attempt to clean up and simplify some of the semantics around ActivityPub and provide a little more of a server specification along with it, but it doesn't use the AS2 vocab.

@jenniferplusplus

jenniferplusplus,
@jenniferplusplus@hachyderm.io avatar

@hrefna
It's handy that several of these AP-but-performant protocols have been tried and failed already. Knowing that path is a dead end is saving me a lot of time and grief
@KevinMarks

KevinMarks,
@KevinMarks@xoxo.zone avatar

@jenniferplusplus @hrefna if you want a usable translator between different formats, Granary is the most complete https://granary.io/ - as used in https://brid.gy
Empirically, the JSON-LD model is not suited to actual interoperability, in that it has been more than 25 years and everyone who tries has vanished into further abstraction as far as I can see.

jenniferplusplus,
@jenniferplusplus@hachyderm.io avatar

@KevinMarks the problem is that activitypub is built on top of json-ld. So the options are to deal with that, or convince the entire fediverse to switch to something else. @hrefna

KevinMarks,
@KevinMarks@xoxo.zone avatar

@jenniferplusplus @hrefna No it isn't. It's built on JSON, with a sprinkling of LD cruft to placate the LD fans. You do not need to process JSON-LD or any other kind of RDF to use AS2 and AP, indeed this is explicitly stated in the spec, and you can leave the @context out if you want.
https://www.w3.org/TR/activitystreams-core/#syntaxconventions

hrefna,
@hrefna@hachyderm.io avatar

@KevinMarks

The two people you are talking to are intimately familiar with web protocols, how they work, and how to build and work with these systems in the real world

AP punts significant things to JSON-LD, ergo, if you want to be compliant with anything using those things you need JSON-LD. The "it's just JSON" line is almost pure marketing

Also we're discussing how these systems get used in practice. For instance, how to parse what is returned by https://xoxo.zone/users/KevinMarks

@jenniferplusplus

hrefna,
@hrefna@hachyderm.io avatar

@KevinMarks

Here, I'll help.

This is the context object for my account:

https://gist.github.com/dclements/cbb5b19e148a913392af26213a8e5495

Now, you can tell me "but you can omit the context object under the spec" all you like, but because things other than ActivityPub are being used, other specs also apply.

I can also point to parts of the minutes for the discussion in building AP in the first place where entire features were punted to JSON-LD such that you need to use JSON-LD for, e.g., language support.

@jenniferplusplus

hrefna,
@hrefna@hachyderm.io avatar

@KevinMarks

Now, I recognize you were around for those parts of it through about Aug 2016, but there seems to be a fundamental disconnect here between /what the spec says/ and /what is being done in reality/.

Between what guides are telling me I can do and what I need to do in order to effectively interoperate with what is out there.

I only barely care about the former, I do care about the latter, and what we have now is a weird mishmash that just confuses everyone

@jenniferplusplus

smallcircles,
@smallcircles@social.coop avatar

@hrefna @KevinMarks @jenniferplusplus

I proposed on SocialHub that going forward we (W3C) might consider ActivityPub to be a JSON-first spec, instead of LD-first. It would be a change of 'stance' and need not break backwards-compat (the impact would likely be nil or low, as there aren't many LD impls).

This different approach would entail that JSON-first devs get a sane extension mechanism, and that independent of that LD devs can go the extra mile.

https://socialhub.activitypub.rocks/t/activitypub-a-linked-data-spec-or-json-spec-with-linked-data-profile/3647

gugurumbe,

@smallcircles @hrefna @KevinMarks @jenniferplusplus LD devs cannot “go the extra mile” if the open world assumption does not hold. The servers will simply drop all extensions and reject objects with multiple types.

smallcircles,
@smallcircles@social.coop avatar

@gugurumbe @hrefna @KevinMarks @jenniferplusplus

True. Something we were discussing today on the @fedidevs matrix chatroom.

Currently the OWA does not hold in our mostly json-only fedi. Or rather it holds if you do the per-project codebase deep-dive in order to be interoperable, and then commence with WDD (whack-a-mole driven development, fixing breakages against all these moving targets).

Yet couldn't open world LD take non open world JSON-only extension into account? Vice versa is harder.

hrefna,
@hrefna@hachyderm.io avatar

@smallcircles

I love WDD and am going to use that going forward.

I would note and underscore as far as WDD is concerned that at the moment it is incredibly difficult to even know to what extent JSON-LD is being supported.

Mastodon doesn't read contexts for most processing but does use some extensions w/ compaction etc, pleroma cheats and does WDD, misskey uses contexts as an extension mechanism, PeerTube actually seems to load things.

@gugurumbe @KevinMarks @jenniferplusplus @fedidevs

hrefna,
@hrefna@hachyderm.io avatar

@smallcircles

How do I know this? A lot of banging my head against git repos.

Virtually no one documents this explicitly anywhere. About the only way to even know to what extent they use JSON-LD, or whether that use is correct, is to run a test suite against it (which is a whole 'nother kettle of fish) or go into the depths of arcana of git histories and do a lot of searching.

@gugurumbe @KevinMarks @jenniferplusplus @fedidevs

smallcircles,
@smallcircles@social.coop avatar

@hrefna @gugurumbe @KevinMarks @jenniferplusplus @fedidevs

Yes, test suites would be great and might give implementation reports.

And some basic extension to the specs, what I call Compliance Profiles, where you learn what extension and FEPs an endpoint supports, with a link to their location (or entry in some location-independent registrar that has that info).

I found that XMPP has a Service Discovery XEP: https://xmpp.org/extensions/xep-0030.html

To discover fine-grained features: https://xmpp.org/registrar/disco-features.html
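A Compliance Profile document, if it existed, might look something like the following sketch. Nothing of this shape is standardized; the field names and the second IRI are invented for illustration:

```python
# Purely hypothetical shape for a "compliance profile" an endpoint could
# serve, listing which specs/extensions it implements and in what role.
profile = {
    "implements": [
        {"spec": "https://www.w3.org/TR/activitypub/", "role": "server"},
        {"spec": "https://example.com/fep/compliance-demo", "role": "server"},
    ]
}

def supports(profile, spec_iri):
    """Check whether an endpoint advertises support for a given spec."""
    return any(e.get("spec") == spec_iri for e in profile.get("implements", []))
```

The XMPP disco links above show the same pattern in an existing protocol: a queryable, machine-readable list of supported features.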

gugurumbe,

@smallcircles @hrefna @KevinMarks @jenniferplusplus @fedidevs XMPP and other messaging protocols like Matrix have to some extent different needs than what we develop on social networks. With instant messaging, you want to make sure the recipient reads exactly what you mean, no more, no less. That’s why you need to standardize everything.

It’s totally acceptable for each social network application that works on your social graph to not understand everything that happens.

smallcircles,
@smallcircles@social.coop avatar

@gugurumbe @hrefna @KevinMarks @jenniferplusplus @fedidevs

Yes, but still endpoints might tell you exactly what domain-specific extensions they support so you can figure out from that info how to develop integration with your use cases.

Having something like Compliance Profiles and Service Discovery is a move towards living documentation and specification, and placing a need for devs to focus on having docs in the first place. Thinking about their extensions more deliberately before coding.

smallcircles,
@smallcircles@social.coop avatar

@gugurumbe @hrefna @KevinMarks @jenniferplusplus @fedidevs

What the quality of those extensions is, is another matter. In a fediverse where not just the protocol is decentralized, but the development too in all kinds of different projects, devhubs and communities, everyone then has their own responsibility for quality assurance of that.

Good quality specs? Higher likelihood for broad adoption. We now already have devhubs for podcasting and federated forging developing their own extension spec.

smallcircles,
@smallcircles@social.coop avatar

@gugurumbe @hrefna @KevinMarks @jenniferplusplus @fedidevs

Btw @indieterminacy today mentioned JMAP, and that protocol also has a similar extension mechanism:

https://www.rfc-editor.org/rfc/rfc8620

> 1.8. Vendor-specific extensions: Individual services will have custom features they wish to expose over JMAP. This may take the form of extra data types and/or methods not in the spec, extra arguments to JMAP methods, or extra properties on existing data types.

gugurumbe,

@smallcircles @hrefna @KevinMarks @jenniferplusplus @fedidevs @indieterminacy For LD to work meaningfully, types of typed objects must be sets. Also, it is clear at the end of 1.8. that unknown extensions could change other data (so, no OWA).

I have no doubt that it is the right thing to do for JMAP, but I believe that the backbone for a social network needs to be more open.

gugurumbe,

@hrefna @smallcircles @KevinMarks @jenniferplusplus @fedidevs Hiding the json-ld complexity under the “Don’t worry, there are libraries to do that for you” does not work, as far as I understand.

by_caballero,

@hrefna @smallcircles @gugurumbe @KevinMarks @jenniferplusplus @fedidevs here's another git repo you could bash your head against if you're keeping a scoresheet of who uses LD how and where. it's also extremely interesting for 100 other reasons.
https://blog.mauve.moe/posts/distributed-press-social-inbox

KevinMarks,
@KevinMarks@xoxo.zone avatar

@hrefna @jenniferplusplus
I do agree with this wholeheartedly. My soundbite is that "Specs should be documentation, not legislation"
A lot of AP is wishful thinking of how something could work in an attempt to legislate it, and actual interop is based on working out what subset Mastodon is using, and what bits it insists you put in whether it looks at them or not. (Don't get me started on webfinger).
We're not quite at the 'WhatWG for Social' stage, but it's close

smallcircles,
@smallcircles@social.coop avatar

@KevinMarks @hrefna @jenniferplusplus

> "Specs should be documentation, not legislation"

I agree to the extent that machine-readable AP extensions that define both formats + behavior imho are a pipe dream (or they are the Semantic Web). So the best a spec can do is legislate a "minimum core", rock-solid extension guidelines, and a discovery mechanism for extension specs + docs + code.

Then everyone can do decentralized development in what is now and will stay an open grassroots ecosystem.

KevinMarks,
@KevinMarks@xoxo.zone avatar

@hrefna @jenniferplusplus I was part of SocialWG, though I spent more time on AS2 than AP, but AP does delegate language support to AS2 per https://www.w3.org/TR/activitypub/#h-note-2
Now, of course, what a spec says and what people actually implement does diverge a great deal (for example, the whole of the C2S part of AP).
My outbox doesn't have a @language in the @context, and manages to have both content and contentMap/en in the objects https://unmung2.appspot.com/jsontoxoxo?url=https%3A%2F%2Fxoxo.zone%2Fusers%2FKevinMarks%2Foutbox%3Fpage%3Dtrue
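The AS2 pattern described here can be shown concretely: an object may carry both a plain `content` and a per-language `contentMap`, with no `@language` in the `@context`, and a consumer picks whichever fits (a minimal sketch, not any project's actual code):

```python
# An AS2-style object carrying both untagged and language-tagged content.
note = {
    "type": "Note",
    "content": "Hello world",
    "contentMap": {"en": "Hello world", "fr": "Bonjour le monde"},
}

def content_for(obj, lang):
    """Prefer the language-tagged value, falling back to plain content."""
    return obj.get("contentMap", {}).get(lang, obj.get("content"))
```

This sidesteps JSON-LD's language-map machinery entirely, which is exactly the divergence between spec and practice being discussed.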

hrefna,
@hrefna@hachyderm.io avatar

Ultimately what this would mean is that you could say:

"I have a mastodon-v1 postprocessor and a security-v1 postprocessor. I see in the @'context that it is using both. I interpret it using the JSON-processor of my choice and can now apply both postprocessors to generate an expanded form—identical to the form generated by a -only processor—in linear time with trivial processing constraints. I can also do the reverse and turn it into a condensed form based on an equivalent processor"
