Storage of versioned websites for browsing on Safe Network

I haven’t looked into working with registers yet, but my hot take is that the concept of a website should be aligned with the files/folders API. A (static) website is just a collection of files, after all.
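
As a rough illustration of that framing (all names here are hypothetical stand-ins, not the actual files/folders API), a site is really just a map from relative path to content address:

```rust
use std::collections::BTreeMap;

/// Hypothetical content address: in Safe Network terms this would be
/// an XOR address, shown here as a fixed-size byte array.
type XorAddress = [u8; 32];

/// A (static) website as nothing more than a collection of files:
/// relative path -> address of the immutable content.
type SiteMap = BTreeMap<String, XorAddress>;

fn main() {
    let mut site: SiteMap = BTreeMap::new();
    site.insert("index.html".into(), [0u8; 32]);
    site.insert("blog/post-1.html".into(), [1u8; 32]);

    // Retrieving a file by name is a plain map lookup, which is the
    // same operation a CLI, a browser, or a FUSE mount would need.
    if let Some(addr) = site.get("index.html") {
        println!("index.html -> {:02x?}", &addr[..4]);
    }
}
```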

For example, being able to retrieve a file by its name from the CLI seems just as important as doing the same via a web browser. (Edit: FUSE/file-system mounts would also need the file names.)

It feels like sn_client should be able to resolve these NRS URLs internally, then retrieve content by XOR URL to maximise caching benefits. Integrations could then just choose between XOR and NRS naming.
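
A minimal sketch of that two-step resolution, with made-up function names standing in for whatever sn_client actually exposes (none of this is the real API):

```rust
/// Hypothetical content address (an XOR address in Safe terms).
type XorAddress = [u8; 32];

/// Step 1: resolve a human-readable NRS name to a content address.
/// Stubbed out here; a real client would do a network lookup.
fn resolve_nrs(name: &str) -> Option<XorAddress> {
    if name == "safe://my-site" { Some([7u8; 32]) } else { None }
}

/// Step 2: fetch by XOR address. Because the address identifies
/// immutable content, the result can be cached indefinitely.
fn fetch_by_xor(addr: XorAddress) -> Vec<u8> {
    // A real implementation would check a local cache first, then the
    // network; identical addresses always yield identical bytes.
    vec![addr[0]]
}

fn main() {
    // Integrations see only the NRS name; the XOR step stays internal,
    // so every caller shares the same cache entries.
    if let Some(addr) = resolve_nrs("safe://my-site") {
        let bytes = fetch_by_xor(addr);
        println!("fetched {} byte(s)", bytes.len());
    }
}
```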

IIRC, when I was looking at the IMIM blog, I used the old NRS implementation to retrieve a map of XOR URLs, which represented the website. When something changed, a new map was created and NRS was updated to point to it. The blog pages themselves still used XOR URLs, though. This meant one NRS lookup, after which everything was cacheable. It also meant viewing an old version of the site was straightforward.
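
In sketch form, that scheme looks something like the following (hypothetical types, not the old NRS API): each edit publishes a fresh immutable site map, and the NRS entry is the only mutable pointer, keeping a history of map addresses.

```rust
/// XOR address of an immutable object (here, a stored site map).
type XorAddress = [u8; 32];

/// Hypothetical stand-in for the mutable NRS entry: an append-only
/// history of site-map addresses, where the last one is "current".
struct NrsEntry {
    versions: Vec<XorAddress>,
}

impl NrsEntry {
    /// A change to the site stores a whole new map; updating NRS is
    /// just appending that map's address. Old versions stay reachable.
    fn publish(&mut self, new_map_addr: XorAddress) {
        self.versions.push(new_map_addr);
    }

    fn current(&self) -> Option<&XorAddress> {
        self.versions.last()
    }

    /// Viewing an old version of the site is just indexing the history.
    fn at(&self, version: usize) -> Option<&XorAddress> {
        self.versions.get(version)
    }
}

fn main() {
    let mut entry = NrsEntry { versions: Vec::new() };
    entry.publish([1u8; 32]); // v0: address of the initial site map
    entry.publish([2u8; 32]); // v1: a page changed, so a new map was stored

    assert_eq!(entry.current(), Some(&[2u8; 32]));
    assert_eq!(entry.at(0), Some(&[1u8; 32]));
}
```

Because the pages inside each map link to one another by XOR URL, a published snapshot never changes out from under a reader, which is what makes both the caching and the version browsing cheap.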

I suppose the question is whether resolving a site or resolving each file individually is desirable. The latter may make sense with today’s tooling, but it may not be optimal or specific enough in a Safe-oriented world. It can also be a source of 404s and of different content being served for the same name, which is exactly what we want to avoid. Is exposing XOR URLs actually desirable in this context?
