As someone roughly in the same age group as the author, who ran a BBS and witnessed the rise of IPv4 networks, HTTP, Mosaic, etc., let me provide a counter-point.
The democratization ends at your router. Unless you are willing to lay your own wires (which, for legal reasons, you most likely won't be able to do), you will remain hopelessly dependent on your ISP. (Radio on free frequencies is possible, and there are valiant attempts, but they will ultimately remain niche and have severe bandwidth limitations.)
For decades, ISPs have throttled upload speeds: they don't want you running services over their lines.
When DSL was around in Germany (I guess it still is), there was a mandatory 24-hour disconnect.
ISPs control what you can see and how fast you can see it. They should be subject to heavy regulation to ensure a free internet.
The large networks, trans-atlantic, trans-pacific cables, all that stuff is beyond the control of individuals and even countries. If they don't like your HTTP(S) traffic, the rest of the world won't see it.
So what you can own is your local network, using hardware that is free of back-doors and remote control.
There's no guarantee of that. If you are being targeted, even the Raspberry Pi you just ordered might be compromised.
We should demand from our legislators that hardware like this is free of back-doors.
As to content creation: there are so, so many tools available that allow non-technical users to write and publish. There's no crisis here beyond picking the best tool for the job.
In short: there's no hope of getting back a world-wide, free, uncensored, unlimited IPv4/IPv6 network. We never had one in the first place.
> We can build such a society. I am not sure why you think this is never possible.
Where does such informed political and economic interest and power exist? With whom do we construct such society? Do they have the power and will to fight for it?
Normies live by normie standards, and with increasing social media exposure their world views grow ever more emotionally manipulated. They are either ignorant or ambivalent.
Will tech people gather on a piece of land and declare independence? Most of my tech worker colleagues are also quite pro-social media and they heavily use it to boost their apparent social status. We cannot even trust our kind.
Similar examples of new technology being used to motivate and mobilize the masses have always ended in devastating wars and genocides. Previously, the speed at which information propagated gave advantages to statespeople like FDR, who could push back against rising racism, Nazism, and violent tendencies (not everywhere, of course; left to its own devices, new technology is almost perfect for constructing dictatorships). Now everybody has equal access to misinformation.
Even in our fabulously wealthy societies, people are mostly worried about paying their bills, taking care of their families, and putting food on the table, not getting together in a quixotic enterprise and paying for thousands of kilometers of communal fiber. Also, as in most communist utopias, what would probably happen is that the control infrastructure would be captured by special-interest groups, and now you've traded one evil for another, but in addition you're left holding the capex bag and you're poorer for it.
The technology is the easy part; the people are the hard part. The reality is that we simply don't have thought leaders in charge anymore. There are no innovations coming to correct course, and very few channels even exist anymore for good ideas to flow upward into proper solutions that preserve, protect, and harden what we want the web to be.

I think a lot of bright minds who could be working on these things understand the dynamics at play, even if they've never taken a huge moment to think about it. Subconsciously they are aware that trying to steer such a big ship would require a monumental exertion that is maybe not worth it anymore. Great leaders never actively seek out leadership positions; similarly, I don't think the people who could be good decision makers and could help these ideals come to fruition actively seek out such positions. The mental tax of getting there is probably enormous. It is not an economic win for anyone to take up the mantle of steering ships this size; it is a massive sacrifice. People who would be fit for the task probably just want to sign off at the end of the day, have a good life, and be a benefit to their communities. In some ways, perhaps that makes them unfit for carrying this torch. Or perhaps there are simply too few people out there adequately qualified to carry it. We are in dire need of competent people at the helm on many fronts, and we simply don't have them. That's just the real set of variables at play right now.
We plebs are just driftwood floating on massive waves of nation-state decision making. I don't doubt there are people working at ISPs who are depressed at the state of things: depressed that they're not allowed to take action on certain issues, depressed that they see first-hand what kinds of control mechanisms they're forced to implement or barred from implementing, and more. It's got to be a trove of BS in an age of misinformation, which has always been an information-systems problem that humanity has implemented (checks notes) zero solutions for. And at the end of the day they, probably like all of us, just want to live a good and meaningful life.
That's not to say just give up on ideals, but rather to acknowledge that ideals are not enough on their own. We need to have some real conversations about what it would actually take to embed these fundamentals into a society, and get comfortable with the uncomfortable realities. A lot of work needs to be done before new ideals can even be shared: outreach alone is a massive uphill battle at this point, given conglomerate control of broadcast media and concentrated ownership of social media apps. These particular ideals also require a decent understanding of technology in general, which most people don't have, making them an incredibly unlikely basis for a society in which they'd be well-enough understood.

So the circus trick is: how do you make this a digestible topic that touches the souls of many and galvanizes them to take the right stance, so that these things become embodied in the ideals a society values, and legislators (or whatever other proxies are tasked with decision making) give them the resourcing or policy attention they deserve? That's the mega-hard part, compounded further by the fact that in most households these types of discussions simply never make it to the TV or computer screen.

Hacker News types like to call these people "normies" and pin the blame on them, but they can't seem to grasp that not everyone could, or should, have a deep compsci background. We should be coexisting with people of a variety of backgrounds, and looking at their "normie"-ness as a thing to account for, not to blame. It would be absurd for a "normie" to expect us to be exceptional at rebuilding car engines or any other broad body of knowledge we haven't committed our own lives and spare time to.
So that leaves the other route: renegade, fine-we'll-do-it-ourselves. That can succeed, but it has its own set of challenges. Fronting infrastructure is expensive, so donors are needed, sometimes on vast scales. To another commenter's point, none of us on the renegade front are laying undersea cables any time soon; crossing the Pacific is a multi-billion-dollar project. Oftentimes we see these underground efforts fail in their infancy simply because the UX flat out sucks, while we're up against entities who can giga-scale all their infrastructure and resources and ultimately capitalize on making whatever app fast and pleasant for users. It feels like we're drowning against titans sometimes; it's overwhelming.
> The large networks, trans-atlantic, trans-pacific cables, all that stuff is beyond the control of individuals and even countries. If they don't like your HTTP(S) traffic, the rest of the world won't see it.
Not really having a plan here, so if nothing else this is out of curiosity, but I'd like to know who actually owns that stuff.
For something as ubiquitous and familiar as the internet, it would probably be good to understand who owns most of its infrastructure.
> there's no hope of getting a world-wide, free, uncensored, unlimited IP4/6 network back.
What do you mean "back"? It was never free, as in zero-cost. It was also not very unlimited; I remember times when I had to pay not only for the modem time online, but also for the kilobytes transferred. Uncensored, yes, because basically nobody cared, and the number of users was relatively minuscule.
The utopia was never in the past, and it remains in the future. I still think that staying irrelevant for large crowds and big money is key.
> We should demand from our legislators that hardware like this is free of back-doors
In some countries that may be possible (if only for now). Where chips are produced makes it an impossibility for most. That is, you can have certain guarantees if you run the chip fab, but downstream of that it is a tall order to guarantee your chips are sovereign. So, while I like the sentiment that you have some sort of control behind your router, I'm really unsure how true that is given the complexity of producing modern-day chips. Disclaimer: not an expert, just an opinion.
I would even say that unless you truly have full custody of the transportation of components as well, that is unlikely. Israel's pager bombs in Lebanon were supplied via a third party, not the manufacturer.
Upstream bandwidth has certainly improved, but ISPs are still hostile to self-hosting: limiting ports, resetting connections every x days, and not providing an IPv4 address for a reasonable charge.
There's also no hope of creating a web that is resistant to enshittification and power consolidation as long as it can technically support any form of economic transaction.
I think the point the author is trying to make is more about these mini networks on their own LAN, which their family uses (and maybe dreaming of a neighbourhood utility LAN as a middle ground between the LAN in your house and the WAN as just a trunk to a big ISP node). The full quote is:
- A Raspberry Pi 3B+ with a 3 gigabyte hard drive setup as a "server" (makes this site available on my home network[9])
- I publish this site via GitHub Pages service for public Internet access (I have the least expensive subscription for this)
...
[9] I can view my personal web on my home network from my phone, tablet and computers. So can the rest of my family.
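For what it's worth, the serving half of that setup is genuinely trivial; Python's standard library alone can serve a directory over the home network. A minimal sketch (the article doesn't say which server the Pi actually runs, so this is just one way to do it):

```python
# Minimal static file server for a home LAN, in the spirit of the
# author's Raspberry Pi setup (a sketch, not their actual stack).
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler


def make_server(directory=".", port=8000):
    """Return an HTTP server that serves `directory` on all interfaces."""
    handler = partial(SimpleHTTPRequestHandler, directory=directory)
    return HTTPServer(("0.0.0.0", port), handler)


# Example: make_server("/srv/mysite").serve_forever()  # blocks; Ctrl-C to stop
```

Anything on the same network can then browse to the machine's LAN address on that port; phones and tablets included.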
For reals. I love the general premise behind the article, but to me how you publish it, and how others access it, is the sauce. Creating static sites is hardly the problem.
I remember a web when practically every ISP allowed you to have a "home page" hosted with them. Your home page was situated in the "public_html" directory of your home directory on their server — hence the name.
Then the URL was http://www.<hostname.domain>/~<username>
I haven't seen a URL with a tilde ('~') in it in a long time.
Why did ISPs stop with this service? Was it to curb illegal file sharing?
The ironic thing is that each subscriber now has a dedicated computer, many times more powerful than what ISPs had back in the day for hosting ALL those websites, sitting in the most privileged part of their network, online 24h and begging to be used for small hosting tasks like this: their ISP-provided router. It even serves its configuration panel through HTML and a webserver, for crying out loud!
Unfortunately, reality is such that those are closed systems with historically abhorrent security, and ISPs usually forbid users from substituting their own choice of router.
I think Apache sets this up by default, or used to. Every user on Linux would get a public_html (or maybe it was htdocs) folder in their home directory when they were added to the system. Any file you put there was served by Apache at /~<username>, read from /home/<username>/public_html on the file system.
There used to be lots and lots of ISPs, and they were small enough to have a single webserver with all their customers set up as users and Apache serving content. They'd also set up FTP on the same server so you could get your HTML files into your folder. Software like Dreamweaver had an FTP client built in, so you'd click a "publish" button and it would log in to FTP and transfer your files.
I would imagine this went away because it got expensive as the customer base grew, ISPs consolidated, and it made no money. Other options with PHP, MySQL, and other services cropped up that could offer more and charge for it, so I think ISPs just preferred to concentrate on network access rather than hosting websites.
Apache doesn't have it on by default, but it's easy to turn on. It's called mod_userdir. By default it serves the ~/public_html directory, so anyone with /home/<name>/public_html ends up at site.url/~<name>/.
It is also possible to add .htaccess and other things there, like a username/password challenge (WWW-Authenticate), on a per-user basis.
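For anyone wanting to recreate this today, a per-user setup along these lines still works with Apache's mod_userdir (a sketch with Debian-style paths; adjust to your distribution):

```apache
# Serve http://host/~alice/ from /home/alice/public_html
<IfModule mod_userdir.c>
    UserDir public_html
    <Directory "/home/*/public_html">
        # AuthConfig lets each user add their own .htaccess password challenge
        AllowOverride AuthConfig
        Require all granted
    </Directory>
</IfModule>
```

On Debian-family systems the module ships disabled and is enabled with `a2enmod userdir` followed by a reload.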
Mostly universities had hosting set up the same way. ISPs would also offer a similar thing for an additional fee on top of your internet subscription. They mostly provided FTP to upload files. Nowadays anyone who tries will use SFTP rather than FTP.
I actually haven't had a homepage for a long time because of the lack of an easy "put my home directory on the web", but now I'd like to go back to doing that.
Think it had more to do with the consolidation of the ISP space.
I used to have my choice of dozens of ISPs. Now, if I'm lucky, I might have 2 or 3, from very large companies that did the math on keeping that going. It mostly happened when ADSL and cable took over; in most areas that meant only 2 or 3 companies could actually provide anything at the speeds customers wanted. At the time they always said it was cost cutting.
As the author of a content management system I made with the idea of democratizing internet content creation, I've had a lot of the same thoughts the author brings up here. I've always thought that even learning Markdown was a bridge too far when it comes to empowering non-technical users, however. In my experience it's best just to supply tooling similar to Word, with buttons for things like lists and bolding. Using Markdown as the underlying format is something I will agree with, though.
Another thought I had is that local AI could most definitely play a part in helping non-technical users create the kind of content they want. If your CMS gives you a GPT-like chat window that allows a non-technical user to restyle the page as they like, or do things like make mass edits - then I think that is something that could help some of the issues mentioned here.
It's definitely an approach. I do think in true democratization of the internet, teaching people some tech is inevitable. We just can't have equal access if we retain the classes of user and maker as completely distinct.
The real barrier was never technical. It was convenience and discovery. Running a Pi at home is trivial for anyone on HN, but the moment you want people to actually find your stuff, you need DNS, a stable IP, and some way to not get buried under the noise.
Tailscale and similar overlay networks have made the "accessible from anywhere" part way easier than it used to be. The missing piece is still discovery. RSS was the closest we got to decentralized discovery, and we collectively let it rot. Maybe it's time to bring it back properly.
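And the publishing half of RSS never stopped being easy: a feed is just a static XML file sitting next to your pages (the URLs below are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>My Home Page</title>
    <link>https://example.net/~alice/</link>
    <description>Posts from a self-hosted site</description>
    <item>
      <title>Hello, small web</title>
      <link>https://example.net/~alice/hello.html</link>
      <pubDate>Mon, 06 Jan 2025 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

The hard part was never generating this; it was getting readers to subscribe to feeds instead of scrolling algorithmic timelines.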
I think the key issue here is that Attention is a temporal construct, meaning discovery is often tied to "being the first thing that comes to people's minds" which means SEO, reverse engineering the ranking algorithms, and constantly having to manage an "online persona". Note none of those things contribute to the actual work you're doing, just your "marketing department" (and whatever time/financial "budget" you intend to give it).
MrBeast figured out the YouTube algorithm - post early and often. Is that how we exist on modern Internet when every website/thumbnail is engineered by a team to maximize clickthrough rates? I agree RSS is useful, but it faces the same scalability issues if everyone starts filling up your RSS feeds. Given the limited amount of time you can devote to a particular task, we'll return to the era of A/B testing Headlines.
I enjoyed the article, but I’m skeptical of the “democratize via hardware + networking” path. Most people won’t run a Pi, manage updates/backups, or debug home networking, and that’s fine (as you note).
But I do think we’re reaching a turning point on the software side. The barrier to building custom, personalized apps is trending toward 0. I’m not naive enough to think every grandma will suddenly start asking ChatGPT to “build me an app to do XYZ,” but with the right UX it can be implicit. Imagine you tell an assistant: “My doctor says my blood sugar is high. Research tips to reduce it.” -> it not only replies with tips, it also proactively builds a custom app (that you own and control) for tracking your blood sugar (measurements, meals, reminders, charts, etc.). You can edit it by describing changes (“add a weekly trend graph,” “don’t nag me after 8pm,” etc.).
This doesn’t fully solve your Big Co control issue (they own the flagship models today), but open-weight + local options keep improving. I'm hopeful we have a chance to tip the scales back toward co-owner and participant.
This post resonated with me. I have tried self-hosting multiple times over the years and always gave up because it was so hard to manage, and there was always an online service that was good enough.
This weekend I vibe-coded (don't shoot me) a homelab platform that hosts a bunch of useful services on a Mac Mini and lets me deploy my own apps on top of it. Using Tailscale, I can access the apps from my phone. I have multiple users, each with their own SSO to control access. I even have a Pi as part of the network that hosts public-facing content. All done with Claude Code and OpenClaw (as a kind of devops tool), with hardly any code written by me. It's been a seriously fun experiment that I will try to progress somehow, if only because I love the dream of "digital sovereignty", even if the reality is it's unlikely to happen again. It got me thinking, though: if I could get inference hardware and a good enough open LLM to work with my setup, it might just be possible. The OP advocates a form of basic computing that is understandable, but once we can host our own LLMs we could end up in a very different yet more capable paradigm.
Even this is hard. Most people don't know what they want, and/or they don't know how to describe it/imagine it. They don't even know what a trend graph is.
They just want someone else to do the mental effort of creating a nice product. Hence iOS > android for most people. They don't want to customise basically anything other than colours.
That's why I predict Lovable, Replit, etc. will not go mainstream, and why ChatGPT will mainly just offer you their UIs. Artifacts weren't a big hit.
If you've never worked in a call center, or in a technical support role, it can be hard to understand just how inarticulate people are on average. Even programmers.
...And how much brainpower goes into understanding what people like this are getting at when they speak about things. There's a lot of context and human element to this; I'm skeptical AI will be any good at it in the near future.
Truly democratizing the web requires that the "compute server" become a typical home appliance, no more difficult to use than an oven or a furnace, with widespread access to vocational technicians who come to your house and fix it for you.
I mean -- my (completely non-technical) mother, after a few hours of my guidance, has started vibe-coding apps and websites for her local community organizations. And, like -- it works.
I am in the process of co-founding a new protocol which creates a decentralized root of trust using normal plain-text names (i.e. `foo.bar`). One of the goals I hope to obtain is allowing domain-style lookups of private websites hosted on P2P networks. It's lofty, but the dialog used by OP is _very_ close to why I think it's necessary.
I've been interested in doing something similar in the past, but I could really never solve issues like domain squatting and stopping individuals from claiming every name possible. Do you have a place where you keep these plans or have discussions around it? Or even just a place where I could get updates if anything does come of it?
> I've been interested in doing something similar in the past, but I could really never solve issues like domain squatting and stopping individuals from claiming every name possible
I think that's just a property of naming systems. Without something like a centralized threat of force that can know every person participating in the system, there really is no recourse. The approach we are taking is making it difficult to create a speculative market around names, which seems to be the driving force behind squatters.
Happy to discuss it in more detail: hackernews@sepositus.com
(Note: that's an alias that goes to my email address which I avoid putting in public places for obvious reasons).
Yes, the protocol is going to be open, and the plan is to submit PRs to various projects when we're at that point. It's not really an "either-or" with Yggdrasil but more like a "both-and."
It's not about ease of publishing. The issue is what people get in return for publishing. Until you can design a platform that gives top creators as much money+attention as commercial platforms, you'll see a drain of top creators and their viewers to commercial platforms.
100%. You don't even need to give people money. It's about attention and feedback.
People post photos on Instagram and status updates on Facebook because their friends will see it there and give it a thumbs up.
A couple of decades ago, I spent a lot of time laboriously building a website from scratch for my photography. It was objectively a really nice site. I had my own domain, hosted it on a VPS, and put a ton of work into the layout and design.
But none of my friends ever thought to go there. I could see from my web stats that every now and then a random stranger would find the site... but they had no easy way of connecting with me and acknowledging that they saw it. If they put in a lot of effort, they could find my email address and write to me, but that's a hell of a lot harder than clicking a little thumbs-up button next to a Facebook post or filling in the comment box.
Uploading photos to my site was about as rewarding as printing them out and throwing them in the trash. I thought about adding comment support to my site, but that opens the whole can of worms around user-generated content, abuse, moderation, etc.
Eventually, I moved to Flickr, which at the time was an actual community that gave me that connection. Then Flickr fizzled out. Now, on the rare times I bother to process a photo... I just upload it to Facebook because that's where (a dwindling subset of) my friends are.
It's not about the content. It's about the human connection. A CMS won't fix that.
Feedback maybe, but blogging didn't start for attention. That's something that got bolted on by a nasty virus we as humans tend to be carriers of. I don't think feedback was even an inspiration for the initial bloggers.
No, I don't think that's true. I was active in the early blogging days and writing blogs as a response to other blogs was a really common pattern and part of the way the community functioned. It was sort of one big distributed conversation.
Certainly, it's fundamental to human nature that if we work hard to create something, we want some way to tell that another human was moved by it.
The only way we own a web of our own is to develop much more of a culture of leaving smallish machines online all the time. Imagine something like Tor or BitTorrent, but everyone has a very simple way of running their own node for content hosting.
That always-on device? To get critical mass, instead of just the nerds, you'd need it to ship with devices which are always-on, like routers/gateways, smart TV's. Then you're back to being at the mercy of centralized companies who also don't love patching their security vulnerabilities.
(1) Security. An always-on, externally accessible device will always be a target for break-ins. You want the device to be bulletproof, with defense in depth, so that breaking into one service does not affect anything else. Something like Proxmox that works on low-end hardware and is as easy to administer as a mobile phone would do; we are still far from that. A very limited thing like a static site could be made both easy and bulletproof, though.
(2) Connectivity providers would need to allow it. Most home routers don't get a static IP, or even a globally routable IPv4 at all, or even a stable IPv6. This complicates the DNS setup, and without DNS such resources are basically invisible.
From the pure resilience POV, it seems more important to keep control of your domain, and have an automated way to deploy your site / app on whatever new host, which is regularly tested. Then use free or cheap DNS and VM hosting of convenience. It takes some technical chops, but can likely be simplified and made relatively error-proof with a concerted effort.
Both of those are solved by having a tunnel and a cache hosted in the cloud. Something like Tailscale or Cloudflare provides this pretty much out of the box, but WireGuard plus nginx on a cheap VPS would accomplish much the same if you are serious about avoiding the big guys.
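The VPS side of that can be a few lines of nginx, assuming the home box is reachable over a WireGuard tunnel (the domain and 10.0.0.2 address below are hypothetical; use your tunnel's actual subnet):

```nginx
# Public VPS: accept traffic here, forward it over the WireGuard tunnel
server {
    listen 80;
    server_name example.org;               # hypothetical public domain

    location / {
        proxy_pass http://10.0.0.2:8080;   # home server's tunnel address
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

This keeps your home IP out of DNS entirely: the world only ever sees the VPS, and caching at the proxy layer absorbs traffic spikes the home link couldn't handle.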
If I'm reading the implication right, that's a pretty terrible idea. Glossing over what running a server would do to your battery, it would never work because of the routing issues you'd run into.
With IPv6 it would theoretically be possible, but with today's IPv4 and NATs everywhere, your website would almost never be reachable, even with fancy workarounds like dynamic DNS.
The irony of this being fully hosted on GitHub should not be lost. A toaster is sufficient to host a mostly static site, a VPS would be far more than sufficient.
Owning things isn't free (and a VPS isn't owning things, either)
I absolutely agree with the concept, but people have to be ready to do their own work rather than delegating it to other parties. Consolidation has happened because these massive conglomerates absorb operational complexity on the cheap, and that's attractive. Moving away from them means we take on the responsibility of doing it ourselves.
You know what they say about the kind of services FAANG gives away for free...
And yes, I get the practicality of it. However, when people are actually doing shit like this[1] in the real world, writers of manifestos might consider practicing what they preach a tad more.
Well, you pay for it at least, and hence enter into a contract with the service provider. A free GitHub account is a come-on by Microsoft to enmesh you further in the world of hosted services - the precise thing this manifesto is complaining about
The cynic in me wants to say that most of the web these days is pushing H.264 frames from a CDN to proprietary phone apps and the rest is pushing Widevine video from the same CDN to proprietary browsers and we'll never cooperatively own any of that, even if we wanted to.
The idealist in me says we should still build a simple to use publishing and discovery system for hypertext that can be self-hosted and self-networked for the day the next generations realize they need it (authoritarian control of the Internet, collapse of social media, infrastructure instability, climate apocalypse, whatever). I suppose my idealism is still pretty pessimistic, but then it is Monday.
I kind of resonate with a lot of things in the article. My own personal view is that we should make hosting stuff vastly simpler; that's one of the goals of my project, at least my attempt (self promo)
Thanks cap'n-py. Yeah, I love Sandstorm. My goal is to be more portable, lighter, and a 'download binary and run' kind of tool. There are also other attempts around what I call the 'packaging with Docker' approach (Coolify, etc.), which are more attempts at packaging existing apps. But my approach—the platform—gives a bunch of stuff you can use to make apps faster, but you have to bend to its idiosyncrasies. In turn, you do not need a beefy home lab to run it (not everyone is a tinkerer). It's more focused, so it will be easier for the end user running it than for the developer.
I think the main issues with federated apps are identity and moderation. Without identity verification it's hard to moderate, so you end up with closed systems where some big co does the moderation at an acceptable level.
The current wave of AI agents is diminishing the value of identity as a DDOS or content-moderation signal. The formula until now included bot = bad, but unless your service wants to exclude everyone using OpenClaw and friends, that's no longer a valid heuristic.
If identity is no longer a strong signal, then the internet must move away from CAPTCHAs and logins and reputation, and focus more on the proposed content or action instead. Which might not be so bad. After all, if I read a thought-provoking, original, enriching comment on HN, do I really care if it was actually written by a dog?
One more half thought: what if the solution to the Sybil problem is deciding that it's not a problem? Go ahead and spin up your bot network, join the party. If we can design systems that assign zero value to uniqueness and require originality or creativity for a contribution to matter, then successful Sybil "attacks" are no longer attacks, but free work donated by the attacker.
> if I read a thought-provoking, original, enriching comment on HN, do I really care if it was actually written by a dog?
I would rather just read the thought as it was originally expressed by a human somewhere in the AI's training data, rather than a version of it that's been laundered through AI and deployed according to the separate, hidden intent of the AI's operator.
Unfortunately, the transparency of the IP stack means that unless you want the whole world to know where you live via one DNS query, you'd need a service to proxy back to yourself. And if you're paying for remote compute anyway, you could probably just host your stuff there: any machine that can proxy traffic back to you is just as capable of hosting your static content itself.
This guy has been around long enough to know about NNTP, which is the original distributed people-focused web, but talks about how HTML is some kind of barrier to entry.
The challenge I've always felt is shared services: if I'm running infra myself, I can depend on it, but if someone else is running it, I'm never really sure I can, which makes external services really hard to rely on and invest in.
Maybe you can get further than expected with individual services? But shared services at some point seem really useful.
I think web2 solved that in an unfortunate way, where you know the corporations operating the services / networks are aligned in some ways but not in others.
But would be great to have shared services that do have better guarantees. Disclaimer, we're working on something in that direction, but really curious what others have seen or thinking in this area.
We got here iteratively, not all at once, so the path back is iterative too. I shouldn't even say back; we're not going back. We have to go in a new direction, and again it's evolutionary. Ultimately a lot of these big systems and big tech companies aren't going anywhere, and they will be integral to all infrastructure for the foreseeable future, whether technical, financial, or related to public services. But as individuals we can slowly shift some of our efforts elsewhere in ways that might matter.
Here's my small contribution to that. https://github.com/micro/mu - an app platform without ads, algorithms or tracking.
I agree with the point that big companies have persuaded people that only they can offer ease of publishing content. Most of my friends publish on Facebook, X, Instagram etc.
I have tried to get them to publish markdown sites using GitHub pages, but the pain of having to git commit and do it via desktop was the blocker.
So I recently made them a mobile app called JekyllPress [0] with which they can publish their posts, similar to the WordPress mobile app. And now a bunch of them regularly publish on GitHub Pages. I think with more tools to simplify the publishing process, more people will start using GitHub Pages (my app still requires some painful onboarding, like creating a repo, enabling GitHub Pages and getting a PAT; no OAuth, as I don't have any server).
But it is portable. It is essentially Markdown files. You can download your repo, compile the Jekyll source to static pages and publish them anywhere.
When you publish to Facebook, WordPress etc you can't easily get your stuff out. You will have to process them even if they allow you to download your content as a zip folder. The images will be broken. Links between pages won't work etc.
Facebook provides a data export service which gives you a zip file with a web version of all your content. I’m not sure what the difference is then between that and a Github hosted repository of all your content as a webpage.
The main difference is the data structure and the intent of the export. Facebook's tool is built for data compliance and local offline viewing, not web portability.
If you open that Facebook zip file, the HTML version is just a massive dump of proprietary markup. To actually migrate those posts to a new blog, you'd have to write a custom scraper just to extract your own text from their messy div tags. If you use their JSON export, you still have to write a custom script to parse their specific schema and remap all the hardcoded local image paths so they work on a live server.
With a Github Pages repo, your content is already sitting there as raw, standardized Markdown. You can just take that folder of .md files, drop it into Hugo, 11ty, or any other static site generator, and it just works. No scraping or data-wrangling required.
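As a rough illustration of the point above, the custom migration script being described might look something like this. This is a hypothetical sketch only: the field names (`title`, `text`) and the `photos_and_videos/` path prefix are illustrative, not Facebook's actual export schema, which differs and changes over time.

```python
import re

def post_to_markdown(post: dict, media_base_url: str) -> str:
    """Turn one exported post (hypothetical schema) into Markdown,
    remapping hardcoded local image paths to where the media will
    live on the new server."""
    body = post.get("text", "")
    # Remap local paths like "photos_and_videos/beach.jpg" to live URLs
    body = re.sub(
        r"photos_and_videos/([\w.-]+)",
        lambda m: f"{media_base_url}/{m.group(1)}",
        body,
    )
    return f"# {post.get('title', 'Untitled')}\n\n{body}\n"
```

The point is less the ten lines themselves than that every platform's export needs its own one-off version of this, whereas a folder of Markdown needs none.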
This is all fine and dandy for websites but what we’ve really been locked out of is email.
You can’t run your own email server in practice: all the other large email providers will consider your self-hosted emails spam by default. It's understandable why they took this stance (due to actual spam), but it is also awfully convenient that it increases their market power.
We are now at the whim of large corps even if we get a custom domain with them.
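For context, the baseline DNS a self-hoster is expected to publish looks roughly like this zone-file sketch for a hypothetical example.com; and even with all of it in place, the big providers may still junk your mail until your IP address builds up reputation:

```
; Hypothetical DNS records for self-hosted mail on example.com
example.com.          IN MX   10 mail.example.com.
example.com.          IN TXT  "v=spf1 mx -all"
_dmarc.example.com.   IN TXT  "v=DMARC1; p=quarantine; rua=mailto:postmaster@example.com"
; plus a DKIM public key published at <selector>._domainkey.example.com,
; and matching forward/reverse DNS (PTR) for mail.example.com
```

Each record is individually simple; the problem is that the full checklist, plus reputation, is exactly the kind of moat the parent comment describes.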
I think this mostly misses the biggest reason why writers would choose big tech platforms or other big platforms: discovery and aggregation. If you want to speak to be heard and not just for its own sake, then you want to go where the people are hanging out and where they could actually find your content.
This is like talking about how book authors don't need Amazon when you have a printer and glue at home.
I welcome everyone who wants to imagine a better web. One with less control by greedy mega-corporations and data-sniffing state actors. I am not sure how such a web should look, but I am pretty certain we will need it sooner or later. Otherwise we may end up with "hey, we were in the 1990s generation, we knew a free web - now this has been replaced by walled corporate gardens controlled by a few superrich".
They totally suck like tiny homes?
No, actually they are better than tiny homes.
Browsers are the #1 reason why you want a computer that's better than a Pi 500. Wanting to play modern games is #2.
Co-ownership of the hardware is a social, not technical, problem. Think of questions of trust, responsibility, who has power, who contributes and how, how decisions are made, etc.
One note: even when we all wrote HTML, most people still used a bigco to host it. Maybe if you go back far enough you can say you don't think regional ISPs were "big", but those companies are all gone now.
It’s not really covered, but p2p technology combined with every phone in the world (and a little wishful thinking) could make for some neat applications.
I mean, you do have a point, and I'll quite agree with it. The only way of monetizing your writing is to use Substack or Medium, or whatever.
Yet your approach lands appallingly far toward the other end of the spectrum. I've been in IT for the past 25 years, and I have yet to see a non-IT person who knows what a dedicated IP is. If you are not publishing it on the internet, then what's the point?
I've seen plenty of companies where the owner just had a read-only shared drive where people could rummage through a pack of PDFs. They were all fine with that.
You have to understand, manage and work with the complexities of the tools, and offer tools that are actually adequate for the task. It's alright to offer what you do to an engineer who has a spare Pi and a couple of days to kill, but it's quite useless for anyone else to adopt.
I really do agree with the sentiment of this guy. I am this guy mentally, but what he is saying is so painfully out of touch and completely ignores how people actually use the web today. My bro, the web isn't controlled by corps because it's too hard to host a web page. Corps have created extremely far-reaching and complex applications, and people prefer to use those over browsing through statically generated pages. The average person doesn't want to come home and update his page; he just wants to open the app on his phone, scroll for a bit and close it.
The problem is in the environment but also in user behavior. Unless you can provide a convincing argument to change both by presenting an actual improvement, it's farting in the wind.
I am on board with basically everything this article is arguing, but I think it covers the easy part (that "people run their own servers" is the only solution to the problems caused by relying on giant ad corps to provide the server half of client/server software) and skips the hard part, which is the software they run.
Like, suppose some really good personal server software existed. Suppose there were an OS-plus-app-repository platform, akin to linux plus snapcraft, but aimed solely at people who want to host a blog or email server despite knowing nothing and being willing to learn nothing. It installs on to a raspberry pi as easy as Windows. It figures out how to NAT out of your cable modem for you. It does all the disk partitioning and apt-gets and chmods, you just open the companion app on your phone and hit the Wordpress button and presto, you've got a blog. You hit the Minecraft button and you've got your own minecraft server, without having to learn what "-Xms2G -Xmx6G" means. It updates itself automatically, runs server components in sandboxes so they can't compromise each other, and it's crack-proof enough that you can store your bitcoins on it. Etc, etc.
If that existed, we wouldn't have to write essays about freedom and so forth to get people to buy it; they'd buy it just because it's there. I mean, look at those digital picture frames - they cost more than a Raspberry Pi and are way less useful, and half the people I know got or gave them for Christmas. Why? Because they're neat and they cost less than a hundred bucks and they require no knowledge or effort. If a server that can host your blog were that easy, it'd get adopted too, and we'd be on a path to some kind of distributed social media FB replacement. Imagine the software you could write, if you were allowed to assume that every user had a server to host it on!
The problem is, that software doesn't exist and it's not clear how it would ever get made. It'd be a huge effort (possibly "Google building Android" sized) and the extant open source efforts along these lines lack traction, mostly due to the chicken-and-egg problem of any new platform that needs apps to be useful. And until it exists, any kind of neighborhood-internet-collective-power-to-the-people dream has to necessarily begin with hoping that millions of people will spontaneously decide to spend their precious free time doing systems administration.
Not to shit on a fine essay that I mostly agree with. It just seems like, without figuring out the software, this is daydreaming.
Very much enjoyed this. I'm always shocked that my colleagues on the humanistic/writing-studies side don't have a larger contingent actively contributing to web and publication technologies/specs, ceding so much of that space to folks with design backgrounds; they still don't invest enough time in understanding networked writing.
> Simple to use software that empowers us to both read and write hypertext4 and syndicated content
Simple to use software... this would be grand!
> Raspberry Pi OS (a Linux distribution based on Debian GNU Linux)
Is this simple? I would contend that it is not. Why do I tell people "buy apple products" as a matter of course? Because they have decent security, great ease of use, and support is an Apple Store away.
They still manage to screw things up.
Look at the emergence of Docker as an install method for software on Linux. We sing the praises of this as a means of software distribution and installation... and yet it's functionally unusable by normal (read: non-technical) people.
I personally think the trend we witnessed with Clawdbot, where people ran out to buy Mac Minis and other ways of self-hosting AI agents, is going to be a huge wind in the sails for hosting things at home generally.
I agree with owning the network devices, and lack of control here is a problem that still has solutions.
And self-hosting personal services makes sense and we're able to do that.
BUT, we don't own the connections. There's always going to be shared infrastructure for connecting these devices worldwide, and without an ideal state of Communism or utopian capitalism we're not going to own them or want to be responsible for them. Any kind of service that depends on a central database is not going to be communally owned.
Ownership is an economic problem, the technical aspect is merely interesting. Bitcoin might be a great example of this.
Okay, and that depends on an entire economy and infrastructure of privately owned switching, other network equipment, fiber optic, etc, etc, etc, -- not to mention that if GitHub did not have, as a private company, a profit motive, they wouldn't even bother to offer the service you're using.
Sure, yes, rebuild the world but if you want it to be free like open source, you'll also need to make it free like beer -- and that means you'll need to work for free, too.
I support the aim. I acknowledge the problems. I'm just so frustrated by these silly oversimplifications of how to solve it.
108 comments:
As someone who is roughly in the same age group as the author and who was running a BBS, has witnessed the rise of IP4 networks, HTTP, Mosaic etc. let me provide a counter-point.
The democratization ends at your router. Unless you are willing to lay down your own wires - which for legal reasons you most likely won't be able to do, we will hopelessly be dependent on the ISP. (Radio on free frequencies is possible and there are valiant attempts, they will ultimately remain niche and have severe bandwidth limitations)
For decades ISPs have throttled upload speeds: they don't want you to run services over their lines. When DSL was around (I guess it still is) in Germany, there was a mandatory 24h disconnect. ISPs control what you can see and how fast you can see it. They should be subject to heavy regulation to ensure a free internet.
The large networks, trans-atlantic, trans-pacific cables, all that stuff is beyond the control of individuals and even countries. If they don't like your HTTP(S) traffic, the rest of the world won't see it.
So what you can own is your local network, using hardware that is free of back-doors and remote control. There's no guarantee of that: if you are being targeted, even the Raspberry Pi you just ordered might be compromised. We should demand from our legislators that hardware like this is free of back-doors.
As to content creation: there are so, so many tools available that allow non-technical users to write and publish. There's no crisis here other than picking the best tool for the job.
In short: there's no hope of getting a world-wide, free, uncensored, unlimited IP4/6 network back. We never had it in the first place.
> In short: there's no hope of getting a world-wide, free, uncensored, unlimited IP4/6 network back. We never had it in the first place.
We can build such a society. I am not sure why you think this is never possible.
People can work for a better world. That sometimes works, too.
Kumbaya is never a motivator. Now, self-interest, on the other hand...
Spot the American.
Capitalism is not the only way of life, and FYGM is a mental illness outside of the US
What is your preferred socioeconomic system? Any countries successfully implementing it, so we can copy them?
Capitalism does not imply FYGM, nor does it benefit from it.
> People can work for a better world. That sometimes works, too.
Not when people make arguments based on dreams, hope, and optimism.
If somebody tells me that we can build a shed, I want them to talk about wood, nails and concrete, or to stop talking.
> We can build such a society. I am not sure why you think this is never possible.
Where does such informed political and economic interest and power exist? With whom do we construct such society? Do they have the power and will to fight for it?
Normies live with normie standards and, with increasing social media exposure, with more and more emotionally manipulated, animal-like world views. They are either ignorant or ambivalent.
Will tech people gather on a piece of land and declare independence? Most of my tech worker colleagues are also quite pro-social media and they heavily use it to boost their apparent social status. We cannot even trust our kind.
Similar examples of new technology being used to motivate and mobilize the masses have always ended in devastating wars and genocides. Previously, the speed of propagation of information gave advantages to statespeople like FDR to put an end to increasing racism/Nazism/violent tendencies (of course not everywhere; when left to its own devices, new technology is almost perfect for constructing dictatorships). Now everybody has equal access to misinformation.
Even in our fabulously wealthy societies, people are mostly worried about paying their bills, taking care of their families, and putting food on the table, not in getting together in a quixotic enterprise and paying for thousands of kilometers of communal fiber. Also like in most communist utopias what would probably happen is that the control infrastructure would be captured by special interest groups and now you’ve traded one evil for another, but in addition you’re left holding the capex bag and you’re poorer for it.
The technology is the easy part; the people are the hard part. The reality is that we simply don't have thought leaders in charge anymore. There are no innovations coming to correct course, and very few channels, if any, still exist for good ideas to flow upwards and result in proper solutions that preserve, protect and harden what we want the web to be.

I think a lot of the bright minds who could be working on these things understand the dynamics at play, even if they've never stopped to think about it. Subconsciously they are aware that becoming the person who tries to steer such a big ship would require a monumental exertion that is maybe not worth it anymore. The great leaders never actively seek out leadership positions; similarly, I don't think the people who could be good decision makers and could bring these ideals to fruition actively seek out such positions. The mental tax of getting there is probably enormous. It is not an economic win for anyone to take up the mantle of steering ships this size; it is a massive sacrifice.

People who would be fit for the task probably just want to sign off at the end of the day and have a good life, existing and being a benefit in their communities. In some ways perhaps that makes them unfit for carrying this torch. Perhaps there are simply too few people adequately qualified to carry it; we are in dire need of competent people at the helm on many fronts, and we simply don't have that. Those are the real-life variables at play right now.
We plebs are just driftwood floating on massive waves of nation-state decision making. I don't doubt there are people working at ISPs who are depressed at the state of things: depressed that they're not allowed to take action on certain things, depressed that they see first-hand what kind of control mechanisms they're forced to implement or disallowed from implementing, and more. It's got to be a trove of BS in an age of misinformation, which has always been an information-systems problem that humanity has implemented, checks notes, zero solutions for. And at the end of the day they, probably like all of us, just want to live a good and meaningful life.
That's not to say we should just give up on ideals, but we should acknowledge that ideals are not enough on their own. Have some real conversations about what it would actually take to embed these fundamentals into a society, and get comfortable with the uncomfortable realities. So much work needs to be done before new ideals can even be shared. Outreach alone is a massive uphill battle at this point, due to conglomerate control of broadcast media and concentrated ownership of social media apps. A lot of these particular ideals require a decent understanding of technology, which most people don't have, making them an incredibly unlikely basis for a society. So the circus trick is: how do you make this a digestible topic that touches the souls of many and galvanizes them to take the correct stance, so that these things become embodied in the ideals a society values, and legislators and whatever other proxies are tasked with decision making give them the resourcing or policy attention they deserve?

That's the mega-hard part, compounded by the fact that most households never have these discussions reach their TV or computer screens at all. Hackernews types like to call these people "normies" and pin the blame on them, but they can't seem to grasp that not everyone could or should have a deep compsci background. We should be coexisting with people of a variety of backgrounds, treating their "normie"-ness as a thing to account for, not to blame. It would be just as absurd for a "normie" to expect us to be exceptional at rebuilding car engines, or any other broad body of knowledge we haven't committed our own lives and spare time to.
So that leaves the other route, which is just... renegade, fine-we'll-do-it-ourselves. That can succeed, but it has its own set of challenges. Fronting infrastructure for a lot of this stuff is expensive, so donors are needed, sometimes on vast scales. To another commenter's point: none of us on the renegade front are laying undersea cables any time soon; those are multi-billion-dollar projects to cross the Pacific. Often these underground efforts fail in their infancy simply because the UX flat out sucks, while we're up against entities who can giga-scale their infrastructure and resources and ultimately capitalize on making whatever app thing fast and pleasant for users. It feels like we're drowning against titans sometimes; it's overwhelming.
> The large networks, trans-atlantic, trans-pacific cables, all that stuff is beyond the control of individuals and even countries. If they don't like your HTTP(S) traffic, the rest of the world won't see it.
Not really having a plan here, so if nothing else this is out of curiosity, but I'd like to know who is actually owning that stuff.
For something that seems so ubiquitous and familiar like the internet, it would probably be good to understand who owns most of its infrastructure.
> there's no hope of getting a world-wide, free, uncensored, unlimited IP4/6 network back.
What do you mean "back"? It was never free, as in zero-cost. It was also not very unlimited; I remember times when I had to pay not only for the modem time online, but also for the kilobytes transferred. Uncensored, yes, because basically nobody cared, and the number of users was relatively minuscule.
The utopia was never in the past, and it remains in the future. I still think that staying irrelevant for large crowds and big money is key.
He literally said "we never had it in the first place".
> We should demand from our legislators that hardware like this is free of back-doors
In some countries that may be possible (if only for now), but where chips are produced makes it an impossibility for most. You can have certain guarantees if you run the chip fab; downstream of that, it can be a tall order to guarantee your chips are sovereign. So while I like the sentiment that you have some sort of control behind your router, I'm really unsure how true that is, given the complexity of producing modern chips. Disclaimer: not an expert, just an opinion.
I would even say unless you truly have full custody of the transportation of components as well, that is unlikely. Israel’s pager bombs in Lebanon were supplied via a third party, not the manufacturer.
To be fair with fibre to the home rolling out in more and more places upstream speeds are improving.
The upstream bandwidth sure improved, but ISPs are still hostile to self-hosting: limiting ports, resetting connections every x days, and not providing a public IPv4 address for a reasonable charge.
There's also no hope of creating a web that is resistant to enshittification and power consolidation as long as it can technically support any form of economic transaction.
> I publish this site via GitHub Pages service for public Internet access
A whole post about not needing big corporations to publish things online, and then they use Microsoft to publish this thing online...
I think the point the author is trying to make is more so about these mini networks on their own LAN, which their family uses. (And maybe dreaming of a neighbourhood utility LAN as a middle ground between LAN in your house and WAN as just a trunk to a big ISP node) The full quote is
There's a whole meme subgenre dedicated to this type of argument. Search for "Yet you participate in society, curious!"
Except there are a ton of alternatives to hosting on Github that are not owned by one of the large companies he's railing against.
Yes, but that person owns their website, its content, and the address it lives at. They can publish anything they want, in any format they want.
Hosting on GitHub is merely a convenience; they can up and leave anytime.
in this case, they're on a github.io subdomain, which they don't own.
For reals. I love the general premise behind the article, but to me how you publish it, and how others access it, is the sauce. Creating static sites is hardly the problem.
I remember a web when practically every ISP allowed you to have a "home page" hosted with them. Your home page was situated in the "public_html" directory of your home directory on their server — hence the name.
Then the URL was http://www.<hostname.domain>/~<username>
I haven't see an URL with a tilde ('~') in it in a long time.
Why did ISPs stop with this service? Was it to curb illegal file sharing?
I've seen a few around on HN actually! Though they tend to be university systems, or pages hosted on https://tilde.club/
The ironic thing is that each subscriber now has a dedicated computer many times more powerful than what ISPs had back in the day for hosting ALL those websites, sitting in the most privileged part of their network, online 24h and begging to be used for small hosting tasks like this: their ISP-provided router. It even serves its configuration panel through HTML and a webserver, for crying out loud!
Unfortunately, reality is such that those are closed systems with historically abhorrent security, and ISPs usually forbid the user from substituting their own choice of router.
I think Apache sets this up by default, or used to. Every user on a Linux system would get a www (or maybe it was htdocs) folder in their home directory when they were added, and any file put there was served by Apache at /~<username>, reading from /home/<username>/www on the file system.
There used to be lots and lots of ISPs, and they were small enough to have a single webserver with all their customers set up as users and Apache serving the content. They'd also set up FTP on the same server so you could get your HTML files into your www folder. Software like Dreamweaver had an FTP client built in, so you'd click a "publish" button and it would log in to FTP and transfer your files.
I would imagine this went away because it got expensive as the customer base grew and ISPs consolidated, and it made no money. Other options with PHP, MySQL and other services cropped up that could offer more and charge for it, so I think ISPs just preferred to concentrate on network access rather than hosting websites.
Apache doesn't have it on by default, but it's easy to turn on. The module is mod_userdir, and the default directory is public_html. So anyone with /home/<name>/public_html ends up at site.url/~<name>/.
It is also possible to add .htaccess files and other things there, like a username/password challenge (WWW-Authenticate), on a per-user basis.
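For reference, the per-user setup being described is Apache's mod_userdir; a minimal configuration sketch (directory names are Apache's defaults, and individual hosts varied):

```apacheconf
# Map /home/<user>/public_html to http://example.com/~<user>/
<IfModule mod_userdir.c>
    UserDir public_html
    <Directory "/home/*/public_html">
        # AuthConfig lets each user add .htaccess password protection
        AllowOverride AuthConfig
        Require all granted
    </Directory>
</IfModule>
```

On Debian-family systems the same thing is typically enabled with `a2enmod userdir`.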
Mostly universities had hosting set up this way. ISPs would also offer a similar thing for an additional fee on top of your internet subscription. They mostly provided FTP to upload files; nowadays anyone doing this would use SFTP rather than FTP.
Universities still have this--well, ok, at least for Faculty :)
https://www.cs.cmu.edu/~btitzer/riff.html
I actually haven't had a homepage for a long time because of the lack of the easy "put my home directory on the web", but I'd like to go back now to doing that.
Think it had more to do with the consolidation of the ISP space.
I used to have my choice of dozens of ISPs. Now if I am lucky I might have 2 or 3, from very large companies that did the math on keeping that going. It mostly happened when ADSL and cable took over; in most areas that meant only 2 or 3 companies could actually provide anything at the speeds their customers wanted. I think at the time they always said it was cost cutting.
Likely demand dropped, and when the infra hosting it needed replacement, it just never got replaced.
As the author of a content management system I made with the idea of democratizing internet content creation, I've had a lot of the same thoughts the author brings up here. I've always thought that even learning Markdown was a bridge too far when it comes to empowering non-technical users, however. In my experience it's best just to supply tooling similar to Word, where you have buttons for things like lists and bolding. Using Markdown as the underlying format is something I will agree with, though.
Another thought I had is that local AI could most definitely play a part in helping non-technical users create the kind of content they want. If your CMS gives you a GPT-like chat window that allows a non-technical user to restyle the page as they like, or do things like make mass edits - then I think that is something that could help some of the issues mentioned here.
> supply tooling similar to Word
Just FYI, Word still has Save As -> Web Page (.htm). For a blog post or newsletter type thing, I bet it works just fine.
I wonder if some part of the FrontPage [0] codebase lives on in there...
[0] https://en.wikipedia.org/wiki/Microsoft_FrontPage
https://openlivewriter.com/ lives on
It's definitely an approach. I do think in true democratization of the internet, teaching people some tech is inevitable. We just can't have equal access if we retain the classes of user and maker as completely distinct.
The real barrier was never technical. It was convenience and discovery. Running a Pi at home is trivial for anyone on HN, but the moment you want people to actually find your stuff, you need DNS, a stable IP, and some way to not get buried under the noise.
Tailscale and similar overlay networks have made the "accessible from anywhere" part way easier than it used to be. The missing piece is still discovery. RSS was the closest we got to decentralized discovery, and we collectively let it rot. Maybe it's time to bring it back properly.
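Part of what made RSS attractive for decentralized discovery is how little machinery it needs; a feed can be generated with nothing but a standard library. A hedged sketch in Python (the site details are placeholders, and a production feed would add more channel metadata):

```python
from xml.etree import ElementTree as ET
from email.utils import formatdate  # RFC 822 dates, as RSS 2.0 expects

def build_feed(site_title: str, site_url: str, posts: list) -> str:
    """Build a minimal RSS 2.0 feed. Each post is a dict with
    'title', 'url', and a Unix 'timestamp'."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_url
    ET.SubElement(channel, "description").text = site_title
    for post in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = post["title"]
        ET.SubElement(item, "link").text = post["url"]
        ET.SubElement(item, "pubDate").text = formatdate(post["timestamp"])
    return ET.tostring(rss, encoding="unicode")
```

The point is that the publishing side was never the hard part; getting readers to subscribe to feeds again is.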
> The missing piece is still discovery.
I think the key issue here is that Attention is a temporal construct, meaning discovery is often tied to "being the first thing that comes to people's minds" which means SEO, reverse engineering the ranking algorithms, and constantly having to manage an "online persona". Note none of those things contribute to the actual work you're doing, just your "marketing department" (and whatever time/financial "budget" you intend to give it).
MrBeast figured out the YouTube algorithm - post early and often. Is that how we exist on modern Internet when every website/thumbnail is engineered by a team to maximize clickthrough rates? I agree RSS is useful, but it faces the same scalability issues if everyone starts filling up your RSS feeds. Given the limited amount of time you can devote to a particular task, we'll return to the era of A/B testing Headlines.
What does “bringing (RSS) back properly” entail in your eyes?
It’s still alive. Many sites still use it. Many people still subscribe to those sites. RSS reader apps are still being created to this day.
RSS directories perhaps, instead of relying on search engines.
I enjoyed the article, but I’m skeptical of the “democratize via hardware + networking” path. Most people won’t run a Pi, manage updates/backups, or debug home networking, and that’s fine (as you note).
But I do think we’re reaching a turning point on the software side. The barrier to building custom, personalized apps is trending toward 0. I’m not naive enough to think every grandma will suddenly start asking ChatGPT to “build me an app to do XYZ,” but with the right UX it can be implicit. Imagine you tell an assistant: “My doctor says my blood sugar is high. Research tips to reduce it.” -> it not only replies with tips, it also proactively builds a custom app (that you own and control) for tracking your blood sugar (measurements, meals, reminders, charts, etc.). You can edit it by describing changes (“add a weekly trend graph,” “don’t nag me after 8pm,” etc.).
This doesn’t fully solve your Big Co control issue (they own the flagship models today), but open-weight + local options keep improving. I'm hopeful we have a chance to tip the scales back toward co-owner and participant.
This post resonated with me. I have tried self-hosting multiple times over the years and always gave up because it is so hard to manage, and there was always an online service that was good enough.
This weekend I vibe coded (don't shoot me) a homelab platform that hosts a bunch of useful services on a Mac Mini and lets me deploy my own apps on top of it. Using Tailscale I can access the apps from my phone. I have multiple users with their own SSO to control access. I even have a Pi as part of the network that hosts public-facing content. All done with Claude Code and OpenClaw (as a kind of devops tool)... hardly any code written by me. It's been a seriously fun experiment that I will try to progress somehow... if only because I love the dream of "digital sovereignty", even if the reality is it's unlikely to happen again. It got me thinking, though: if I could get inference hardware and a good enough open LLM to work with my setup, it might just be possible. The OP advocates a form of basic computing that is understandable, but when we are able to host our own LLMs we could end up in a very different but more capable paradigm.
The repo for the homelab, for anyone who has an interest: https://github.com/briancunningham6/homelab
> You can edit it by describing changes
Even this is hard. Most people don't know what they want, and/or they don't know how to describe it/imagine it. They don't even know what a trend graph is.
They just want someone else to do the mental effort of creating a nice product. Hence iOS > android for most people. They don't want to customise basically anything other than colours.
That's why I predict Lovable, Replit, etc. will not go mainstream. And why ChatGPT will mainly just offer you their UIs. Artifacts weren't a big hit.
If you've never worked in a call center, or in a technical support role, it can be hard to understand just how inarticulate people are on average. Even programmers.
...And how much brainpower goes into understanding what people like this are getting at when they speak about things. There's a lot of context and human element to this; I'm skeptical AI will be any good at it in the near future.
Truly democratizing the web requires that "Compute Server" becomes a typical home appliance that is no more difficult to use than an oven or a furnace including the widespread access to vocational technicians who come to your house and fix it for you.
I mean -- my (completely non-technical) mother, after a few hours of my guidance, has started vibe-coding apps and websites for her local community organizations. And, like -- it works.
"it also proactively builds a custom app"
Does it deploy it as well?
I am in the process of co-founding a new protocol which creates a decentralized root of trust using normal plain-text names (i.e. `foo.bar`). One of the goals I hope to obtain is allowing domain-style lookups of private websites hosted on P2P networks. It's lofty, but the dialog used by OP is _very_ close to why I think it's necessary.
I've been interested in doing something similar in the past, but I could really never solve issues like domain squatting and stopping individuals from claiming every name possible. Do you have a place where you keep these plans or have discussions around it? Or even just a place where I could get updates if anything does come of it?
> I've been interested in doing something similar in the past, but I could really never solve issues like domain squatting and stopping individuals from claiming every name possible
I think that's just a property of a naming system. Without something like a centralized threat of force that can know every person participating in the system - there really is no recourse. The approach we are taking is making it difficult to create a speculative market around names which seems to be the driving force behind squatters.
Happy to discuss it in more detail: hackernews@sepositus.com
(Note: that's an alias that goes to my email address which I avoid putting in public places for obvious reasons).
Have you considered improving Yggdrasil or something similar?
Yes, the protocol is going to be open and the plan is to submit PRs to various projects when we're at that point. It's not really an "either or" with Yggdrasil but more like a "both and."
It's not about ease of publishing. The issue is what people get in return for publishing. Until you can design a platform that gives top creators as much money+attention as commercial platforms, you'll see a drain of top creators and their viewers to commercial platforms.
100%. You don't even need to give people money. It's about attention and feedback.
People post photos on Instagram and status updates on Facebook because their friends will see it there and give it a thumbs up.
A couple of decades ago, I spent a lot of time laboriously building a website from scratch for my photography. It was objectively a really nice site. I had my own domain, hosted it on a VPS, and put a ton of work into the layout and design.
But none of my friends ever thought to go there. I could see from my web stats that every now and then a random stranger would find the site... but they had no easy way of connecting with me and acknowledging that they saw it. If they put a lot of effort in, they could find my email address and email me, but that's a hell of a lot harder than just clicking a little thumbs-up button next to a Facebook post or filling in a comment in the comment box.
Uploading photos to my site was about as rewarding as printing them out and throwing them in the trash. I thought about adding support for that to my site, but then it opens the whole can of worms around user-generated content, abuse, moderation, etc.
Eventually, I moved to Flickr, which at the time was an actual community that gave me that connection. Then Flickr fizzled out. Now, on the rare times I bother to process a photo... I just upload it to Facebook because that's where (a dwindling subset of) my friends are.
It's not about the content. It's about the human connection. A CMS won't fix that.
>It's about attention and feedback.
Feedback maybe, but blogging didn't start for attention. That's something that got bolted on by a nasty virus we as humans tend to be carriers of. I don't think feedback was even an inspiration for the initial bloggers.
No, I don't think that's true. I was active in the early blogging days and writing blogs as a response to other blogs was a really common pattern and part of the way the community functioned. It was sort of one big distributed conversation.
Certainly, it's fundamental to human nature that if we work hard to create something, we want some way to tell that another human was moved by it.
The only way we own a web of our own is to develop much more of a culture of leaving smallish machines online all the time. Imagine something like Tor or BitTorrent, but everyone has a very simple way of running their own node for content hosting.
That always-on device? To get critical mass beyond just the nerds, you'd need it to ship in devices that are already always on, like routers/gateways and smart TVs. Then you're back to being at the mercy of centralized companies, who also don't love patching their security vulnerabilities.
This is very right. There are two obstacles.
(1) Security. An always-on, externally accessible device will always be a target for breaking in. You want the device to be bulletproof, and to have defense in depth, so that breaking into one service does not affect anything else. Something like Proxmox that works on low-end hardware and is as easy to administer as a mobile phone would do. We are somehow far from this yet. A very limited thing like a static site may be made both easy and bulletproof though.
(2) Connectivity providers should allow that. Most home routers don't get a static IP, or even a globally routable IPv4 at all. Or even a stable IPv6. This complicates the DNS setup, and without DNS such resources are basically invisible.
From the pure resilience POV, it seems more important to keep control of your domain, and have an automated way to deploy your site / app on whatever new host, which is regularly tested. Then use free or cheap DNS and VM hosting of convenience. It takes some technical chops, but can likely be simplified and made relatively error-proof with a concerted effort.
Both of those are solved by having a tunnel and a cache hosted in the cloud. Something like Tailscale or Cloudflare provides this pretty much out of the box, but WireGuard + nginx on a cheap VPS would accomplish much the same if you are serious about avoiding the big guys.
If you already pay for a cheap VPS, why not host the whole thing there? It's the simple Web. (As has been noted in comments elsewhere.)
if only we all had a little device that was always on and connected…
If I'm reading the implication right, you're having a pretty terrible idea. Glossing over what running a server would do to your battery, it would never work because of the routing issues you'll run into.
With IPv6 it would theoretically be possible, but currently, with IPv4 and NATs everywhere, your website would almost never be reachable, even with fancy workarounds like dynDNS.
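For what it's worth, the dynDNS-style workaround is simple in principle: poll your public address and push it to your DNS provider only when it changes. A rough sketch, where the provider's update endpoint and payload are hypothetical (every dynamic-DNS API differs) and ipify is just one of several "what is my IP" services:

```python
import json
import urllib.request

def get_public_ip(resolver_url="https://api.ipify.org?format=json"):
    """Ask a public echo service for our address; ipify returns {"ip": "..."}."""
    with urllib.request.urlopen(resolver_url, timeout=10) as resp:
        return json.load(resp)["ip"]

def needs_update(current_record, observed_ip):
    """Only touch DNS when the address actually changed."""
    return current_record != observed_ip

def update_record(api_url, token, hostname, ip):
    """POST the new address to the provider (endpoint/payload are placeholders)."""
    payload = json.dumps({"hostname": hostname, "ip": ip}).encode()
    req = urllib.request.Request(
        api_url,
        data=payload,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status == 200
```

Run it from cron every few minutes and a home server behind a changing IPv4 address stays findable — though none of this helps if you're behind CGNAT with no inbound connectivity at all.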
The irony of this being fully hosted on GitHub should not be lost. A toaster is sufficient to host a mostly static site, a VPS would be far more than sufficient.
GitHub is free, a VPS isn't.
Owning things isn't free (and a VPS isn't owning things, either)
I absolutely agree with the concept, but people have to be ready to do their own work rather than delegating it to other parties. Consolidation has happened because these massive conglomerates absorb operational complexity on the cheap, and that's attractive. Moving away from them means we take on the responsibility of doing it ourselves.
You know what they say about the kind of services FAANG gives away for free...
And yes, I get the practicality of it. However, when people are actually doing shit like this[1] in the real world, writers of manifestos might consider practicing what they preach a tad more.
[1]: https://solar.lowtechmagazine.com
You think you "own" a VPS?
Well, you pay for it at least, and hence enter into a contract with the service provider. A free GitHub account is a come-on by Microsoft to enmesh you further in the world of hosted services - the precise thing this manifesto is complaining about
The cynic in me wants to say that most of the web these days is pushing H.264 frames from a CDN to proprietary phone apps and the rest is pushing Widevine video from the same CDN to proprietary browsers and we'll never cooperatively own any of that, even if we wanted to.
The idealist in me says we should still build a simple to use publishing and discovery system for hypertext that can be self-hosted and self-networked for the day the next generations realize they need it (authoritarian control of the Internet, collapse of social media, infrastructure instability, climate apocalypse, whatever). I suppose my idealism is still pretty pessimistic, but then it is Monday.
I kind of resonate with a lot of things in the article. My own personal view is that we should make hosting stuff vastly simpler; that's one of the goals of my project, or at least my attempt at it (self-promo):
https://github.com/blue-monads/potatoverse
Potatoverse is a great name :)) BTW do you remember Sandstorm.io?
Thanks cap'n-py. Yeah, I love Sandstorm. My goal is to be more portable, lighter, and a 'download binary and run' kind of tool. There are also other attempts around what I call the 'packaging with Docker' approach (Coolify, etc.), which are more attempts at packaging existing apps. But my approach—the platform—gives a bunch of stuff you can use to make apps faster, but you have to bend to its idiosyncrasies. In turn, you do not need a beefy home lab to run it (not everyone is a tinkerer). It's more focused, so it will be easier for the end user running it than for the developer.
I think the main issue with federated apps is identity and moderation. Without identity verification it is hard to moderate, so you end up with closed systems where some big co does the moderation at an acceptable level.
This is only half a thought.
The current wave of AI agents is diminishing the value of identity as a DDOS or content-moderation signal. The formula until now included bot = bad, but unless your service wants to exclude everyone using OpenClaw and friends, that's no longer a valid heuristic.
If identity is no longer a strong signal, then the internet must move away from CAPTCHAs and logins and reputation, and focus more on the proposed content or action instead. Which might not be so bad. After all, if I read a thought-provoking, original, enriching comment on HN, do I really care if it was actually written by a dog?
We might finally be getting close to https://xkcd.com/810/.
One more half thought: what if the solution to the Sybil problem is deciding that it's not a problem? Go ahead and spin up your bot network, join the party. If we can design systems that assign zero value to uniqueness and require originality or creativity for a contribution to matter, then successful Sybil "attacks" are no longer attacks, but free work donated by the attacker.
> if I read a thought-provoking, original, enriching comment on HN, do I really care if it was actually written by a dog?
I would rather just read the thought as it was originally expressed by a human somewhere in the AI's training data, rather than a version of it that's been laundered through AI and deployed according to the separate, hidden intent of the AI's operator.
Unfortunately, the transparency of the IP stack means that unless you want the whole world to know where you live via one DNS query, you'd need to use a service to proxy back to yourself. And if you're paying for remote compute anyway, you could probably just host your stuff there. Any machine that can proxy traffic back to you is just as capable of hosting your static stuff itself.
It only gives a pretty rough estimate, not a street address. I don't think many self-hosters have run into issues with this.
This guy has been around long enough to know about NNTP, which is the original distributed people-focused web, but talks about how HTML is some kind of barrier to entry.
HTTP requires always-on + always-discoverable infrastructure
It's all over the place.
I really like this model for individual services.
The challenge I've always felt is shared services -- if I'm running infra myself, I can depend upon it, but if someone else is running it, I'm never really sure I can, which makes external services really hard to rely on and invest in.
Maybe you can get further than expected with individual services? But shared services at some point seem really useful.
I think web2 solved that in an unfortunate way, where you know the corporations operating the services / networks are aligned in some ways but not in others.
But it would be great to have shared services that do have better guarantees. Disclaimer: we're working on something in that direction, but I'm really curious what others have seen or are thinking in this area.
We got here iteratively, not all at once. So the path back is also iterative. I shouldn't even say back; we're not going back. We have to go in a new direction. And again, it's evolutionary. So ultimately a lot of these big systems and big tech companies aren't going anywhere, and they will be integral to all infrastructure for the foreseeable future, whether technical, financial, or related to public services. But as individuals we can slowly shift some of our efforts elsewhere in ways that might matter.
Here's my small contribution to that. https://github.com/micro/mu - an app platform without ads, algorithms or tracking.
I agree with the point that big companies have persuaded people that only they can offer ease of publishing content. Most of my friends publish on Facebook, X, Instagram, etc.
I have tried to get them to publish markdown sites using GitHub pages, but the pain of having to git commit and do it via desktop was the blocker.
So I recently made them a mobile app called JekyllPress [0] with which they can publish their posts, similar to the WordPress mobile app. And now a bunch of them regularly publish on GitHub Pages. I think with more tools to simplify the publishing process, more people will start using GitHub Pages (my app still requires some painful onboarding, like creating a repo, enabling GitHub Pages and getting a PAT; no OAuth as I don't have any server).
[0] https://www.gapp.in/projects/jekyllpress/
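For the curious, the serverless PAT flow the parent describes boils down to one call to GitHub's contents API, which takes the file body base64-encoded. A minimal sketch (owner/repo/path are placeholders; updating an existing file would additionally require its current sha, and error handling is omitted):

```python
import base64
import json
import urllib.request

def build_payload(markdown, message="new post"):
    """The contents API wants the commit message plus a base64 file body."""
    return {
        "message": message,
        "content": base64.b64encode(markdown.encode("utf-8")).decode("ascii"),
    }

def publish_post(owner, repo, path, markdown, token):
    """PUT a new file into the repo; GitHub Pages rebuilds the site on push."""
    url = f"https://api.github.com/repos/{owner}/{repo}/contents/{path}"
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(markdown)).encode(),
        method="PUT",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```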
Isn’t publishing on Github Pages still posting to a corporate centrally owned entity and not a solution to the problem described?
But it is portable. It is essentially markdown files. You can download your repo, compile the Jekyll site to static pages and publish them anywhere.
When you publish to Facebook, WordPress, etc. you can't easily get your stuff out. You will have to process it even if they allow you to download your content as a zip folder. The images will be broken, links between pages won't work, etc.
Facebook provides a data export service which gives you a zip file with a web version of all your content. I’m not sure what the difference is then between that and a Github hosted repository of all your content as a webpage.
The main difference is the data structure and the intent of the export. Facebook's tool is built for data compliance and local offline viewing, not web portability. If you open that Facebook zip file, the HTML version is just a massive dump of proprietary markup. To actually migrate those posts to a new blog, you'd have to write a custom scraper just to extract your own text from their messy div tags. If you use their JSON export, you still have to write a custom script to parse their specific schema and remap all the hardcoded local image paths so they work on a live server. With a Github Pages repo, your content is already sitting there as raw, standardized Markdown. You can just take that folder of .md files, drop it into Hugo, 11ty, or any other static site generator, and it just works. No scraping or data-wrangling required.
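To illustrate the kind of "custom script" that migration requires, here's a sketch that turns one post from a JSON export into a Markdown file with front matter. The post shape here is a simplified assumption for illustration, not Facebook's actual schema, and image-path remapping is left out:

```python
from datetime import datetime, timezone

def post_to_markdown(post):
    """Convert one post dict (assumed shape: {"timestamp": ..., "data":
    [{"post": "..."}]}) into a Jekyll/Hugo-style Markdown document."""
    ts = datetime.fromtimestamp(post["timestamp"], tz=timezone.utc)
    body = post.get("data", [{}])[0].get("post", "")
    # Use the first line of the post as a title, truncated for sanity.
    title = (body.splitlines() or ["untitled"])[0][:60]
    front = "\n".join([
        "---",
        f'title: "{title}"',
        f"date: {ts.date().isoformat()}",
        "---",
    ])
    return front + "\n\n" + body + "\n"
```

The point stands either way: with a Pages repo the Markdown already exists, whereas with an export you have to reverse-engineer it into existence.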
This is all fine and dandy for websites but what we’ve really been locked out of is email.
You can’t run your own email server. All the large email providers will consider your self-hosted email spam by default. It’s understandable why they took this stance (due to actual spam), but it is also awfully convenient that it increases their market power.
We are now at the whim of large corps even if we get a custom domain with them.
I think this mostly misses the biggest reason why writers would choose big tech platforms or other big platforms: discovery and aggregation. If you want to speak to be heard and not just for its own sake, then you want to go where the people are hanging out and where they could actually find your content.
This is like talking about how book authors don't need Amazon when you have a printer and glue at home.
I welcome everyone who wants to imagine a better web. One with less control by greedy mega-corporations and data-sniffing state actors. I am not sure how such a web should look, but I am pretty certain we will need it sooner or later. Otherwise we may end up with "hey, we were in the 1990s generation, we knew a free web - now this has been replaced by walled corporate gardens controlled by a few superrich".
> Tiny computers are like tiny homes
They totally suck, like tiny homes? No, actually, they are better than tiny homes. Browsers are the #1 reason you want a computer that's better than a Pi 500. Wanting to play modern games is #2.
Co-ownership of the hardware is a social, not a technical, problem. Think of questions of trust, responsibility, who has power, who contributes and how, how decisions are made, etc.
One note: even when we all wrote HTML, most people still used a bigco to host it. Maybe if you go back far enough you can say you don't think regional ISPs were "big", but those companies are all gone now.
I don't wanna brag but this is pretty much the premise behind my soon-to-be-published scifi novel:
https://inkican.com/mesh-middle-grade-scifi-thriller/
A web we own should at least be mobile responsive.
It’s not really covered, but p2p technology combined with every phone in the world (and a little wishful thinking) could make for some neat applications.
I mean, you do have a point, and I'll quite agree with it. The only way of monetizing your writing is to use Substack or Medium, or whatever.
Yet your approach sits appallingly far at the other end of the spectrum. I've been in IT for the past 25 years, and I have yet to see a non-IT person who knows what a dedicated IP is. If you are not publishing it on the internet, then what's the point?
I've seen plenty of companies where the owner just had a read-only shared drive where people could rummage through a pack of PDFs. They were all fine with that.
You have to understand, manage and work with the complexities of the tools, and the tools you offer have to be adequate for the task. It's alright to offer what you do to an engineer who has a spare Pi and a couple of days to kill, but it's quite useless for anyone else to adopt.
I really do agree with the sentiment of this guy. I am this guy mentally, but what he is saying is so painfully out of touch and completely ignores how people actually use the web today. My bro, the web isn't controlled by corps because it's too hard to host a web page. Corps have created these extremely far-reaching and complex applications, and people prefer to use those over browsing through statically generated pages. The average man doesn't want to come home and update his page; he just wants to open the app on his phone, scroll for a bit and close it.
The problem is in the environment but also in user behavior. Unless you can provide a convincing argument to change both by presenting an actual improvement, it's farting in the wind.
I am on board with basically everything this article is arguing, but I think it covers the easy part (that "people run their own servers" is the only solution to the problems caused by relying on giant ad corps to provide the server half of client/server software) and skips the hard part, which is the software they run.
Like, suppose some really good personal server software existed. Suppose there were an OS-plus-app-repository platform, akin to linux plus snapcraft, but aimed solely at people who want to host a blog or email server despite knowing nothing and being willing to learn nothing. It installs on to a raspberry pi as easy as Windows. It figures out how to NAT out of your cable modem for you. It does all the disk partitioning and apt-gets and chmods, you just open the companion app on your phone and hit the Wordpress button and presto, you've got a blog. You hit the Minecraft button and you've got your own minecraft server, without having to learn what "-Xms2G -Xmx6G" means. It updates itself automatically, runs server components in sandboxes so they can't compromise each other, and it's crack-proof enough that you can store your bitcoins on it. Etc, etc.
If that existed, we wouldn't have to write essays about freedom and so forth to get people to buy it, they'd buy it just because it's there. I mean, look at those digital picture frames - they cost more than a rasbpi and are way less useful, and half the people I know got or gave them for christmas. Why? Because they're neat and they cost less than a hundred bucks and they require no knowledge or effort. If a server that can host your blog were that easy, it'd get adopted too, and we'd be on a path to some kind of distributed social media FB replacement. Imagine the software you could write, if you were allowed to assume that every user had a server to host it on!
The problem is, that software doesn't exist and it's not clear how it would ever get made. It'd be a huge effort (possibly "Google building Android" sized) and the extant open source efforts along these lines lack traction, mostly due to the chicken-and-egg problem of any new platform that needs apps to be useful. And until it exists, any kind of neighborhood-internet-collective-power-to-the-people dream has to necessarily begin with hoping that millions of people will spontaneously decide to spend their precious free time doing systems administration.
Not to shit on a fine essay that I mostly agree with. It just seems like, without figuring out the software, this is daydreaming.
I like how it's not mobile friendly.
Very much enjoyed this. I'm always shocked that my colleagues on the humanistic/writing-studies side don't have a larger contingent actively contributing to web and publication technologies/specs, ceding so much of that space to folks with design backgrounds; they still don't really invest enough time in understanding networked writing.
> Simple to use software that empowers us to both read and write hypertext4 and syndicated content
Simple to use software... this would be grand!
> Raspberry Pi OS (a Linux distribution based on Debian GNU Linux)
Is this simple? I would contend that it is not. Why do I tell people "buy apple products" as a matter of course? Because they have decent security, great ease of use, and support is an Apple Store away.
They still manage to screw things up.
Look at the emergence of Docker as an install method for software on Linux. We sing the praises of this as a means of software distribution and installation... and yet it's functionally unusable by normal (read: non-technical) people.
Usability needs to make a comeback.
> great ease of use
Apple stuff is a nightmare of dark patterns and user-hostile idiocy.
Maybe it's easy if you have Stockholm syndrome and have internalized all the arcane gestures, icons and bug avoidance patterns.
The average normie has no clue, though. (This is borne out by experience; I have like 8 iPhones in the immediate family among children and seniors.)
I personally think the trend we witnessed with clawdbot, where people ran to buy Mac Minis or found other ways of self-hosting AI agents, is going to be a huge wind in the sails for hosting things at home more generally.
I agree with owning the network devices, and lack of control here is a problem that still has solutions.
And self-hosting personal services makes sense and we're able to do that.
BUT, we don't own the connections. There's always going to be shared infrastructure for connecting these devices worldwide, and without an ideal state of Communism or utopian capitalism we're not going to own them or want to be responsible for them. Any kind of service that depends on a central database is not going to be communally owned.
Ownership is an economic problem, the technical aspect is merely interesting. Bitcoin might be a great example of this.
Bro just found fediverse
This is silly nonsense.
>I publish this site via GitHub Pages
Okay, and that depends on an entire economy and infrastructure of privately owned switching, other network equipment, fiber optics, etc. -- not to mention that if GitHub, as a private company, did not have a profit motive, they wouldn't even bother to offer the service you're using.
Sure, yes, rebuild the world but if you want it to be free like open source, you'll also need to make it free like beer -- and that means you'll need to work for free, too.
I support the aim. I acknowledge the problems. I'm just so frustrated by these silly oversimplifications of how to solve it.
Who is this we, kemosabe?