A photo of a tree, hiding a streetlamp in between its leaves.

I think uploading images should work now that I have a media endpoint. If everything works, here’s a picture I snapped recently.

Since Kittybox (or more precisely, its IndieAuth library) doesn’t support omitting PKCE, I found myself unable to log into some apps like Together. Kinda sad, but that’s the price of being on the bleeding edge of the spec, with almost no support for legacy clients.
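
For context: a PKCE-capable client derives a code_challenge from a random code_verifier and attaches it to the authorization request; legacy clients simply omit those parameters, which a strict server then rejects. A sketch of the request a conforming client builds (the endpoint and client URLs are made up, and the challenge is the S256 example value from RFC 7636, not a freshly computed one):

```rust
/// Build an IndieAuth/OAuth 2.0 authorization request URL with PKCE.
/// A real client derives the challenge as base64url(sha256(code_verifier))
/// from a fresh random verifier; hashing is elided in this sketch.
fn authorization_url(code_challenge: &str) -> String {
    format!(
        "https://auth.example.com/authorize\
         ?response_type=code\
         &client_id=https%3A%2F%2Fclient.example.com%2F\
         &redirect_uri=https%3A%2F%2Fclient.example.com%2Fredirect\
         &state=abc123\
         &code_challenge={code_challenge}\
         &code_challenge_method=S256"
    )
}

fn main() {
    // Example S256 challenge value taken from RFC 7636, Appendix B.
    let url = authorization_url("E9Melhoa2OwvFrEMTJguCHaoeK1t8URWbuGJSstw-cM");
    assert!(url.contains("code_challenge_method=S256"));
    println!("{url}");
}
```

A server that requires PKCE rejects any request missing these two parameters, which is exactly what locks out older clients.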

I had to add authorization_endpoint and token_endpoint links to the page header though, since I might be pretty much the only one implementing the newest spec properly. For now.
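
For reference, those discovery links are plain rel links in the page head (or equivalent HTTP Link headers). A tiny sketch of rendering them, with placeholder URLs rather than Kittybox's actual endpoints:

```rust
/// Render IndieAuth endpoint discovery links for a page's <head>.
/// The URLs passed in are placeholders for illustration only.
fn endpoint_links(auth: &str, token: &str) -> String {
    format!(
        "<link rel=\"authorization_endpoint\" href=\"{auth}\">\n\
         <link rel=\"token_endpoint\" href=\"{token}\">"
    )
}

fn main() {
    let head = endpoint_links(
        "https://example.com/auth",
        "https://example.com/token",
    );
    assert!(head.contains("rel=\"authorization_endpoint\""));
    println!("{head}");
}
```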

Next step could probably be actually making webmentions work...

test

if u see this i have implemented indieauth and can now do stuff

Sometimes you have to scale your ambitions back and go slow to prevent yourself from overworking and subsequently burning out. It is sometimes hard to admit, but it is the truth of this world.

I hate OpenSSL, but the webauthn-rs crate happens to use it. Ugh. I don’t want to ship C code in Kittybox! But it looks like rewriting the whole webauthn-rs crate to remove the OpenSSL dependency might be the only option, if it’s possible at all.

Oh well. It looks like I tried to implement an asynchronous template system for Rust, but instead of actual templates I ended up manually writing a stream of Bytes instances. It takes 10 times as much space as its output. But it’s asynchronous, and I expect that TTFB for, say, a large feed, would be much better with it.

For completeness, here is one template I wrote. It should’ve been an h-card.

use bytes::Bytes;
use futures::{Stream, StreamExt};

// `chunk` isn't shown in the post; a minimal definition could look like this:
fn chunk(bytes: &'static [u8]) -> impl Stream<Item = Bytes> + Unpin + Send {
    futures::stream::once(std::future::ready(Bytes::from_static(bytes)))
}

pub fn card(mut mf2: serde_json::Value) -> impl Stream<Item = Bytes> + Unpin + Send {
    let uid: String = match mf2["properties"]["uid"][0].take() {
        serde_json::Value::String(uid) => uid,
        _ => panic!("h-card without a uid")
    };
    // Mutable indexing with [0] panics when the property is absent entirely,
    // so the optional photo has to go through get_mut() instead.
    let photo: Option<String> = match mf2["properties"]["photo"]
        .get_mut(0)
        .map(serde_json::Value::take)
    {
        Some(serde_json::Value::String(url)) => Some(url),
        _ => None
    };
    let name: String = match mf2["properties"]["name"][0].take() {
        serde_json::Value::String(name) => name,
        _ => panic!("encountered h-card without a name")
    };
    chunk(b"<article class=\"h-card\">")
        .chain(futures::stream::once(std::future::ready(
            match photo {
                Some(url) => {
                    let mut tag = Vec::new();
                    tag.extend_from_slice(b"<img class=\"u-photo\" src=\"");
                    html_escape::encode_double_quoted_attribute_to_vec(url, &mut tag);
                    tag.extend_from_slice(b"\" />");

                    Bytes::from(tag)
                },
                None => Bytes::new()
            }
        )))
        .chain(futures::stream::once(std::future::ready({
            let mut buf = Vec::new();
            buf.extend_from_slice(b"<h1><a class=\"u-url u-uid p-name\" href=\"");
            html_escape::encode_double_quoted_attribute_to_vec(uid, &mut buf);
            buf.extend_from_slice(b"\">");

            html_escape::encode_text_to_vec(&name, &mut buf);

            buf.extend_from_slice(b"</a></h1>");

            Bytes::from(buf)
        })))
        .chain(chunk(b"</article>"))
}

It’s huge. Here is the output it should produce (whitespace is mine):

<article class="h-card">
    <img class="u-photo" src="https://example.com/media/me.png" />
    <h1><a class="u-url u-uid p-name" href="https://example.com/">Jane Doe</a></h1>
</article>

I need some sort of macro system to work with these. The idea itself seems good, but the implementation... meh.
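
For illustration, here's roughly what such a macro might look like - a hypothetical `html!` that concatenates byte pieces into a single buffer (plain `Vec<u8>` stands in for `Bytes` to keep the sketch dependency-free):

```rust
// Hypothetical helper: concatenate byte-like pieces into one buffer,
// sketching how a macro could collapse the manual Vec-building above.
macro_rules! html {
    ($($piece:expr),* $(,)?) => {{
        let mut buf: Vec<u8> = Vec::new();
        $( buf.extend_from_slice($piece.as_ref()); )*
        buf
    }};
}

fn main() {
    let name = "Jane Doe";
    let chunk = html!(b"<h1>", name.as_bytes(), b"</h1>");
    assert_eq!(chunk, b"<h1>Jane Doe</h1>".to_vec());
    println!("{}", String::from_utf8(chunk).unwrap());
}
```

A real version would still need to interleave escaping and wrap each piece into a stream item, but the repetitive buffer plumbing could disappear behind the macro.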

This content is also featured in IndieNews, the IndieWeb news aggregator.

The new generation of my own website was in its early stages of development for way too long. Several years passed before I was able to finally ship even a proof of concept, and yet ambitious thoughts won't leave my head. While unable to use my website and fully engage with the IndieWeb, I was forced to fall back to older technologies, such as RSS feeds and traditional social network silos, and yet I think this might've inspired me to create something new.

This is a proposal for a new generation of social readers, built right into the browser and based on open standards such as Microsub and Micropub, allowing the user to seamlessly transition from the old-style web we know to the new generation of the social web - self-hosted, self-sovereign and free of unnecessary corporate influence, while not being bound to inferior and redundant technologies such as the blockchain and the "Web 3.0" fad it started.

The role of a modern web browser

The modern web landscape has significantly changed since the invention of the World Wide Web by Tim Berners-Lee in 1989. From a document sharing system it was then transformed by its users into a proto-social network of personal webpages that mixed graphical media with textual content. Then it was once again transformed by the "dot-com boom", which accelerated both development of the technology and its commercialization and centralization.

The modern web browser is now the centerpiece of every computing device; without the web, modern computing as we know it wouldn't exist. The Internet supports many of our use cases, from simple file sharing to videoconferencing, completely transforming our lives. And all of this in a single app. But... something is lacking here.

Current social networks present in the Internet landscape are mostly designed to show undesirable and irrelevant advertisements to users, not to connect them and facilitate communication. Many services once used by friend groups to communicate are now transitioning away from the social network paradigm and turning into content-pushing machines, where the only choice the user has is whether to scroll down or stay on the current page. Control is being slowly taken away from users, turning what was intended to be the primary means of communication and information exchange in the 21st century into a glorified TV with a touchscreen instead of buttons.

The fundamental concepts of "self-hosting" and "the social web"

But control can be taken back. Taking control of one's own social web experience and shaping it can primarily be facilitated through the concept of "self-hosting" - provisioning resources for oneself that facilitate information exchange and are controlled by the user instead of third parties. Delegation of control is possible when necessary and authorized - but data sovereignty is a must. Corporations come and go; their services may go down and never return. By taking control of their own data and the responsibility for hosting the content they produce, an individual gains the ability to fully control and curate their own unique online experience.

The IndieWeb community is based on exactly that thought and is building new Internet protocols to help people reclaim their space on the modern web. As part of its work, open standards and protocols were developed to facilitate the new generation of the social web and data exchange, using a personal website as the center point of data sovereignty and control. The user, being in control of their website, uses it to engage with other people on the social web while staying in control of the content they produce and consume. This is unlike current social networks, where accounts can be banned instantly with all their data gone, and where, instead of choosing things to read, watch or listen to, content is forced down the user's throat by a black-box set of numbers masquerading as "artificial intelligence" (which sometimes acts directly against its own moniker, lacking any true intelligence or understanding of human nature and humanity's wishes).

However, this concept, while otherwise compelling, is incomplete. The level of integration between the IndieWeb and its protocols and the old-style web is lower than it could be, and the main place where the two can be reconciled is what we use the most to interact with the web - the web browser itself.

Current state of affairs in the social web

In the collection of protocols and concepts developed by the IndieWeb community, one stands out the most, encompassing the central concept of any social network - the feed. It's called a "social reader": an application, most commonly a web app, that presents to the user a social-network-style interactive feed, or a set of feeds, allowing the user not only to consume content but to actively engage and interact with it. It borrows from the conventional social network experience, but uses modern IndieWeb protocols such as Microsub to let the user stay in control of their data and prevent any third party from messing with it without the user's explicit consent or disrespecting the user's freedoms in any way.

The social reader allows one to curate a set of feeds filled with content and then interact with them, posting replies, comments and notes (and even bookmarking whole articles, or expressing appreciation with a "like" post, mirroring the "like" feature of conventional social network silos) to one's own website. This lets the user stay in control of their own data and rely on third parties as little as possible while retaining the ability to interact with the wider World Wide Web. Sadly, being often confined to a web application, social readers are limited in their ability to interact with anything outside of the user's feeds, which limits the user's reach on the social web. While discovery engines based on syndication (such as indieweb.xyz, created by the community, or the old-style "planet" content aggregators) help expand that reach, discovering new content can eventually take the user out of the social reader onto a standalone, non-social-web-aware webpage, where social interactions via one's own website are harder to facilitate. Solutions are being explored to remedy that, such as "webactions" - custom protocol handlers that indicate a prompt for an action to be performed inside a social reader app and posted to the user's website.

However, webactions are not natively supported by browsers, requiring JavaScript polyfills and often degrading the user experience because of that. The epitome of the concept would be integrating the social reader directly into the browser, allowing it to facilitate social web interactions without any external client-side software.

A new generation of social readers

Modern web browsers include a "new tab" page that opens whenever an empty tab or window is opened. This experience can be redesigned to take users straight to their social reader, integrated directly into the browser instead of being a standalone web page. The experience would then never degrade, even when users are taken out of their reader to a standalone webpage: the browser could show buttons corresponding to actions that can be taken on the page being viewed - for example, posting a comment on one's own website and then notifying the author using a Webmention, syndicating the content to one's own website (commonly called a "repost" in social network silo parlance), or simply bookmarking it as something interesting to refer to in later discussions, or for personal use.

Native UX should be designed so that the social reader doesn't feel like a wart on top of a browser, but a natural extension of it. Such a design could allow users to seamlessly interact even with pages that aren't aware of the new generation of "social web", since the user's website will still be able to retain their interactions with the old-style page.

Most browsers allow the use of so-called "Web Extensions" to augment the browsing experience. Sadly, this often leaves the extension with minimal UI to show the user, aside from a single button beside the omnibox, or injecting itself into every webpage and projecting its UI in there, potentially breaking the page's layout in the process. This leaves the mechanism ill-suited for integrating a social reader experience into the browser. Therefore, development of a new browser chrome, powered by one of the conventional engines such as Gecko or Blink, would be the most likely way to proceed with the implementation of this concept.

Web Extensions could still be used to prototype and experiment with the concept. Omnibear is an existing extension that allows one to author posts and interact with the social web. However, it was abandoned around 2019, and it doesn't provide the social reader experience - only minimal ways to send interactions with foreign content to one's own website. Some of its concepts are similar enough to be reused, and inspiration could be taken from its UX.

The endgame

By fully taking control of their own data, users gain control over their social web life. A modern web browser must be augmented with features to facilitate social web interactions, to prevent UX degradation when inevitably landing on a page not aware of social web features. This will help users have a more pleasant and seamless experience on the social web, and help boost adoption by enhancing the experience where social web interactions aren't natively supported by the websites themselves, due to ignorance, oversight or corporate malice.

did I just start an outline for a small essay on modern web and social readers?

this will be interesting, I promise

Wow, it turns out buffering file uploads and downloads in my media endpoint doubled my upload speed!
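
Kittybox's actual code isn't shown here, but the general shape of the change in std terms is wrapping the raw reader and writer in buffers, so each underlying read/write moves one large block instead of many tiny ones:

```rust
use std::io::{self, BufReader, BufWriter, Write};

/// Copy `data` through a buffered reader/writer pair, as one might do
/// between a network stream and a file. In-memory here for demonstration.
fn buffered_copy(data: &[u8]) -> io::Result<Vec<u8>> {
    let mut out = Vec::new();
    {
        let mut reader = BufReader::with_capacity(64 * 1024, data);
        let mut writer = BufWriter::with_capacity(64 * 1024, &mut out);
        io::copy(&mut reader, &mut writer)?;
        writer.flush()?; // push any bytes still sitting in the buffer
    }
    Ok(out)
}

fn main() -> io::Result<()> {
    let data = vec![42u8; 1 << 20];
    let copied = buffered_copy(&data)?;
    assert_eq!(copied, data);
    Ok(())
}
```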

It looks like Kittybox is close to its finish line and general protocol-compliance goal. The only unimplemented parts are:

  • In-house IndieAuth (auth and tokens)
  • Webmentions
  • WebSub pings

Then it will reach full protocol-compliance status, and I could go on to develop other things like pretty UI for posting, the Microsub server (because I really want my own!) etc.

Big endian vs little endian is a pain. I had to go through two conversions to serialize an `std::net::Ipv4Addr` to a big-endian [u8; 4] - the middle conversion was a u32 of native endianness.

The most surprising thing is that the compiler optimizes this down to nothing. Maybe the IP addresses are already stored internally as big endian? Would make sense to store them in the same order they’re commonly used.
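
A sketch of the two-step conversion; `Ipv4Addr::octets()` does return the bytes already in network (big-endian) order, which is consistent with the round-trip optimizing away:

```rust
use std::net::Ipv4Addr;

/// The two-step route: Ipv4Addr -> native-endian u32 -> big-endian bytes.
/// u32::from(addr) interprets the address as a big-endian integer,
/// so to_be_bytes() puts the octets back in wire order.
fn be_bytes(addr: Ipv4Addr) -> [u8; 4] {
    let native: u32 = addr.into();
    native.to_be_bytes()
}

fn main() {
    let addr = Ipv4Addr::new(192, 0, 2, 1);
    assert_eq!(be_bytes(addr), [192, 0, 2, 1]);
    // octets() already yields network (big-endian) order directly,
    // so the detour through u32 changes nothing.
    assert_eq!(addr.octets(), be_bytes(addr));
    println!("{:?}", be_bytes(addr));
}
```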

That feeling when you write a unit test using a library and then accidentally discover a bug in that exact library instead of your own code...

The bug in question, if you’re interested.

i really want to add webmention support and display into Kittybox right now so I wouldn’t be bored of not being able to see if someone interacts with me but i cant code for too long or i will blow a fuse in my brain and become a dumb kitty for a week

so i will rest and play minecraft like a responsible person

see? im caring for myself!

Next thing I should do since I fixed the bug: webmentions. I really need to handle webmentions, and I think I actually will be able to do so rather easily now that I know my database doesn’t lock up anymore. I just need to attach an MF2 parser.
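
Receiving a webmention boils down to: fetch the mention's source URL, verify the fetched document actually links to the target, then parse the MF2 to classify the interaction. The fetch and the MF2 parsing are elided in this sketch; it only shows the naive link check on already-fetched HTML (a real implementation should parse the document rather than substring-match, which misses relative URLs and admits false positives):

```rust
/// Naive check that a fetched source document links to `target`.
/// For illustration only - real code should parse the HTML and the MF2
/// instead of substring matching.
fn mentions_target(source_html: &str, target: &str) -> bool {
    source_html.contains(target)
}

fn main() {
    let html = r#"<article class="h-entry">
        <a class="u-in-reply-to" href="https://example.com/posts/1">a reply</a>
    </article>"#;
    assert!(mentions_target(html, "https://example.com/posts/1"));
    assert!(!mentions_target(html, "https://example.com/posts/2"));
}
```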

In general, what I would want to have in a perfect world is:

  • My own IndieAuth implementation
  • My own webmention acceptor (I already have plans but I need some extra software for it to work)
  • My own media endpoint (that autoconverts pictures to webp)
  • My own Microsub server
  • Editing posts in-band when logged in via IndieAuth
  • Making that second widget on the homepage do something interesting

No more bug. I squashed it for real now.

I should consider adding a regression test so it never shows up again. But is it worth it if I caused the bug by being stupid?

Maybe tests were in fact made to guard from stupidity.
