Wack Playstation Sup! @HerraBRE

One of the paradoxes I struggle with in my work, is the conflict between crypto and reliability.

Crypto is important. But it is very binary in nature - either the stars align and you can decrypt, or it fails and there's no recovery. With that kind of binary, reliability suffers. This is inevitable.

As an example, most of the Mastodon downtime I've experienced has been related to minor SSL certificate blunders.

I feel like most of the community wilfully ignores this dynamic.


I think it's really interesting to follow Dave Winer (inventor of RSS) on Twitter - he's very concerned about the current push towards HTTPS.

He's afraid raising the security bar will make the web less open and less accessible. And he's right; adding technical requirements favours the entrenched big players with big budgets.

Dave also fears for the historic web, in the (unlikely?) event that browser vendors actually deprecate HTTP.

I don't agree with everything he says, but the POV has value.

We've actually seen this dynamic before - in e-mail.

Because of spam, the weird technical hoops you had to jump through to run a mail server kept getting more and more insane.

And at the same time, people demand more and more functionality, fancy web interfaces, high availability.

Fast forward to today: even big universities and governments with dedicated IT departments have given up and pointed their MX records at GMail.

Is the web headed the same way?

@HerraBRE there's not much pushing in the other direction imo. just Building A Thing is 20x more complicated than it used to be

@aeonofdiscord Yeah.

I'm really conflicted about this, because part of the reason things were easier before is that they were ridiculously insecure - that's not OK in engineering or food or transport... and it shouldn't be in IT either.

But openness and low barriers to entry have great value too, I don't want us to lose those.

@HerraBRE The pressure (as of right now) for HTTPS seems less severe than the pressure to centralize e-mail, but it's still there.

@HerraBRE
I don't think so. Crypto will become streamlined and it will become part of the APIs.

The real problem IMO was governments' resistance to encryption, which made it so difficult to implement.

E-mail is a very different problem: It assumed there were no hostile agents in the network. It was based on trust. The protocol is completely broken: anyone can put trash in your inbox FOR FREE. Change the protocol so that sending costs money, even a cent, and you'll fix spam forever.

@HerraBRE
As for the web, I see a different trend: People are abandoning adware in favor of community-driven platforms (just look where you're posting). And with things like Hubzilla and, coming up, Plume, we'll have decentralized websites, blogs, zines, all supported by the community.

The Facebook backlash showed us a new future: a decentralized, even peer to peer web. It's no longer webs spun by giant spiders to catch all; it's tunnels built by ants to communicate. This is just the beginning.

@HerraBRE the daft thing is that servers going down is something that's actually allowed for in email. (If the host you're sending to is down, your SMTP server will usually retry over the course of a few days before it finally gives up)

"Availability" only became an issue when people started using browser-based email instead of local email client software.

@HerraBRE I haven't acted as a closed relay for years -- but I did operate that way for years before. It's also the huge spam load as well, of course.

Heck, even with jumping through all the hoops I can barely get my cronjobs and logs off without running afoul of mr google. He'll suddenly start bouncing them for a few days and then silently go back to accepting my mails. Why??? wtf knows.

@gemlog @HerraBRE

Has anyone mentioned Let's Encrypt?

I haven't had much trouble setting it up, and it runs unattended.

It's still hard to protect from DOS attacks, but I've coasted by doing nothing since my site isn't popular enough to take down.
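For what it's worth, the "runs unattended" part is usually just a scheduled renewal. A sketch assuming the certbot client (distribution packages often install an equivalent timer for you; the reload hook and schedule here are example choices, not requirements):

```
# crontab fragment: attempt renewal twice a day.
# "certbot renew" only renews certificates close to expiry,
# and the deploy hook reloads the web server when one is replaced.
0 3,15 * * * certbot renew --quiet --deploy-hook "systemctl reload nginx"
```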

I do fear for email, and I wonder why there isn't something like fediverse for email, but written in something easy like Go or Rust, where a noob admin can set up a sane server in 10 minutes and start accepting email.

Spam may always be hard.

@golf_oil @gemlog The sites where I've recently experienced SSL related downtime were almost all using Let's Encrypt certs.

Add a moving part, you've added another thing that can break. If that part is authenticated crypto, that means your site is down.

Again, I'm not saying people shouldn't do it. But in my experience it DOES make things less reliable.

Re: e-mail, see mailinabox.email/. The reason it fails is that the Internet rejects its mail as spam. No simple server kit can avoid that.

@herrabre I've been self-hosting email for years now, and it's working out fine :-).
@herrabre Are reverse DNS (PTR), DKIM, all that all set?
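For anyone wondering what "all that" covers, these are the DNS records receiving servers commonly check before trusting your mail. A zone-file sketch with placeholder names and values (the domain, IP, selector, and keys are examples, not real records):

```
; PTR: reverse DNS for the mail server's IP must resolve to its hostname
4.3.2.1.in-addr.arpa.      IN PTR  mail.example.com.
; SPF: which hosts are allowed to send mail for the domain
example.com.               IN TXT  "v=spf1 mx -all"
; DKIM: the public signing key, published under <selector>._domainkey
s1._domainkey.example.com. IN TXT  "v=DKIM1; k=rsa; p=<base64-public-key>"
; DMARC: policy telling receivers what to do when SPF/DKIM fail
_dmarc.example.com.        IN TXT  "v=DMARC1; p=quarantine"
```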

@HerraBRE yeah - i've advocated https-all-the-time kind of reflexively for ages, but there's merit to at least some of the qualms that people have.

a _lot_ of the merit arises from the simple fact that https is a garbage design in many important ways and _absolutely terrible_ to work with for just about everyone, from novices with a static blog to full time pros with complex infrastructure.

(and yes, this remains true in the era of letsencrypt.)

@HerraBRE The web is getting a lot more complex as a long running trend, universal https one of the smallest parts of that complexity and one of the most defensible imo.

Big players add a lot of non-security related requirements.

HTTPS is actually one of those things that is getting slightly less complex over time, with tooling support increasing for the basic usecase.

@herrabre I feel bad about how the push towards HTTPS puts Certificate Authorities in a position where we have to ask them for permission to host some random website.

Correction: I need to remember not to just credit people with inventing stuff. Dave was one of the pioneers of RSS, not THE inventor. My bad, sorry.

This Chrome security announcement is one I have mixed feelings about:

blog.chromium.org/2018/05/evol

On the one hand, this is the right thing for security and protecting the average web surfer's privacy.

On the other, I can totally see why people are concerned about the future of the legacy clear-text web of HTTP URLs and pages (see thread).

@HerraBRE Yeah, so when my 'secure' blog shows the http inline images, users can get a scary and alarming warning about a page with insecure elements.

@HerraBRE My web hosting doesn't reasonably support cheap SSL. Thankfully I have little reason to deploy it, since I don't give a half a crud how Google ranks my websites.

@HerraBRE MITMing most web traffic is pretty uninteresting, encrypting it is good, sure, but the cost is high, especially since you're now dependent on a centralized list of CAs.

Whoever thought requiring a CA for encrypting traffic was cool should be publicly shamed for all eternity.

@ocdtrekkie You're missing a threat which is actually common in the wild: MITMing to inject crapware.

ISPs do this, this isn't hypothetical.

Also, if you believe people should be able to surf anonymously and want Tor users to have access, consider that it's super easy to spin up a malicious exit node that corrupts traffic.

Securing your sites with TLS protects your visitors from that sort of thing, which makes it worth doing almost no matter what sort of content you provide.

@HerraBRE I try to avoid ISPs that inject junk. Arguably, if the ability to inject junk is part of your agreement with them (and one would hope it is factored into the price/value equation), they should be able to do so in nonsecure contexts.

I'd be happy to jump on the encryption bandwagon, that being said, if CAs weren't involved. They've been proven untrustworthy over and over again. The fact that we have people trying to push a *mandate* that we deal with them is borderline insane.

@HerraBRE (Re: ISPs that inject junk, NetZero was an amazing thing to exist back in the day.)

@ocdtrekkie Again, it's not about you. It's about your users.

People don't know about these terms and they don't know the implications. And they may have no choice, not all areas have competing ISPs.

Anyway, such EULAs are problematic for a bajillion reasons, I'm surprised you'd use them as justification for anything!

You can shrug and say their ISP is not your problem. I tend to err on the side of saying we have a duty of care towards our users, but people can disagree with me on that.

@HerraBRE @ocdtrekkie Also, people on such ISPs (including many national mobile providers, I understand) maybe won't be sophisticated enough to distinguish between what's actually on your site and what the ISP has injected. They'll just see that your site has silly ads or whatever.

@HerraBRE @ocdtrekkie

I can confirm, my ISP is known for browser hijacking

@ocdtrekkie @HerraBRE you should change your webhosting provider. in times of letsencrypt it's more than bad to not support "easy ssl".

@ninjafoss @HerraBRE You'd be surprised how many web hosts don't. Shared hosting is pretty darn common.

In my case, I haven't found a suitable replacement: I won't buy either domains or hosting from a company that doesn't have 24/7 US-based phone support, which is a rapidly decreasing commodity these days.

@edavies @ninjafoss @HerraBRE No, but since you don't have root or often even shell access on shared hosting, you're at the whims of your host's offerings.

You can't run something like a Let's Encrypt certbot, for instance, unless the host sets it up for you.

@ocdtrekkie @ninjafoss @HerraBRE Yes, need to pick your host carefully, which can indeed be difficult.

I had my domain registered with my current hosting provider while my hosting was elsewhere, until the previous hosting provider became untenable, at which point the current one was really the only acceptable option - annoying, as I'd prefer to keep the domain and hosting separate.

But the point: SSL hosting isn't intrinsically difficult or expensive, just a matter of what the market provides.

@edavies @ocdtrekkie @ninjafoss I wouldn't be at all surprised if a bunch of hosting providers were still pretending Lets Encrypt doesn't exist and using SSL as a differentiator for "premium" hosting plans.

That'll slow down adoption at the lower end of the hosting market.

@HerraBRE @edavies @ninjafoss I think in my host's case, they just provide a fairly dated version of cPanel/WHM which isn't yet EOL. I believe newer cPanels support Let's Encrypt out of the box, so presumably once they have to upgrade, it'll support it.

@edavies @ninjafoss @HerraBRE Keeping domain and web hosting separate is an absolute must. Same for email service as well. Three different companies control this aspect of my online presence, which makes losing any one of them fairly recoverable.

@HerraBRE He's been for decentralisation for possibly decades now. However, he remains a liberal capitalist, and he's exclusively using Twitter and Medium.

He's right about unnecessary complexity. But you can't make progress without improving the web on things such as #ActivityPub. The discussion shouldn't be whether we want a simpler web (which we do) but whether we want a more decentralised web.

@HerraBRE Making the web more decentralised involves complex work, oversimplifying everything is not a solution. That's how improving things works. We just need to decide which parts of the web we want to leave behind meanwhile we build what's new. We're changing wheels whilst driving on the driveway.

Part of the web some of us want to leave behind is the web that enables the existence of Facebook, Google, Twitter.

@HerraBRE Arming ourselves with true substitutes for those is going to take a lot of work, and adding new technology that is not currently available for everybody, largely because Facebook, Google, and Twitter have suppressed it, or hindered its development.
Twitter used to publish RSS feeds for every stream in the early days.

But they also had something similar to a PubSub api in the early days, and other things we can do so much better than they did.

Oversimplification is never the answer.

@HerraBRE is it that hard though? Most websites are fairly small and don't require a difficult infrastructure which would prevent the use of HTTPS.
It's like one click away on your cPanel (or whatever you have) for most hosting service. And Let'sEncrypt works very well.
If the website is big or the infrastructure is huge, well, you already crossed the "small player" line imo...

@alexmercier It doesn't need to be "that hard" to be one little step along a slippery slope.

This particular step is also backwards incompatible. Even if sites upgrade, there's no tool to globally search/replace all the old, insecure http:// links other people have made to your site.

So, either you keep HTTP working so those links keep working, or you break a piece of the historic web.

Some people choose to break the web. That trend worries others.

@HerraBRE sincere question: if this is a site you still use today and plan to upgrade to HTTPS, why not set up a 301 redirect at the same time? You can't control how people write your address, but you can redirect them once they arrive.

I tend to agree that breaking the old web isn't ideal (nor necessary)
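The redirect itself is only a few lines of server config. A sketch assuming nginx (Apache's Redirect directive does the same job; the domain is a placeholder):

```
# nginx: answer plain-HTTP requests with a permanent (301) redirect
# to the HTTPS version of the same URL, so old http:// links made by
# other sites keep working instead of breaking.
server {
    listen 80;
    server_name example.com;
    return 301 https://example.com$request_uri;
}
```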

@HerraBRE In the end, there is no such thing as a secure computer. Computers don't understand the human motivations that lead to a concept of security. There are just reliable, transparent computing systems, and ones that are readily subverted.

Building brittle systems in a wobbly environment is guaranteed to result in cracks and sharp edges.

@HerraBRE
@rysiek
One of the most important things I've learned about security during last few years is that it consists of not just Confidentiality and Integrity, but also Availability. So a situation where you can't connect to a server, or decrypt a message, is still treated as a security failure. There's often (always?) a tradeoff between Availability and the other two components, but it's not true that security people don't care about Availability.

@Wolf480pl @rysiek This is an enlightened way to look at things, and I applaud you for it.

I don't think everyone is so enlightened, but hopefully we're getting there.

@Wolf480pl @HerraBRE +1

One of the many ways to think about it is: if your service is so secure that it's basically unavailable to your users, they will find ways around it, most probably using a way less secure service/channel.

@rysiek @Wolf480pl @HerraBRE

Security = Fragility

Adding security does mean there are more things that can happen where you won't be able to run the program.

Security <> Poor usability

You should be able to code good usability right on top of your security. Ideally the user won't know or care about the security. Just assume it's there.

@hairylarry @rysiek @HerraBRE

>Security = Fragility

I think you went too far here.
I'm pretty sure there are thousands of ways to make a program fragile without making it any more secure than it was before.

@hairylarry @rysiek @HerraBRE

Also, consider a program that accepts incoming network connections and has a buffer overflow bug.

It's fragile, because sending it a wrong kind of network packet will make it crash, i.e. stop working.

It's also insecure, because a malicious person can send the wrong kind of packet on purpose, either gaining remote code execution (breaching Integrity) or at least purposely crashing the program and making it not work for other users (breaching Availability).

@hairylarry @rysiek @HerraBRE
Now if someone patches that bug, the program will be both more secure and less fragile.

@Wolf480pl @rysiek @HerraBRE

Agree. I don't have to make a program better in any way to add fragility.

@hairylarry @Wolf480pl @HerraBRE you can also make a program less fragile and more secure at the same time.

In many contexts fragility is *insecurity*. If you can break things, they can leak information, or you can exploit them.

Making your design simpler, with fewer moving parts, makes it more secure and less fragile at the same time.

@rysiek @Wolf480pl @HerraBRE

Agreed. And if the insecurity leads to intrusion that is fragility.

What I was getting at is: if you have a program and you decide to password-protect it (adding security to an existing code base), then besides whatever fragility you already had, you will also have a security module that can screw up.

The better idea, as you suggest, is to design in security, robustness and a good user interface through simplicity.

@hairylarry @Wolf480pl @HerraBRE as long as we can all agree that there is a fundamental difference between saying "Security = Fragility", and saying "In certain circumstances certain measures that are introduced in the name of security can cause additional fragility", I'm good.

@rysiek @Wolf480pl @HerraBRE

Yeah, I didn't really phrase it right. That whole transitive nature of equality thing.

@rysiek @Wolf480pl @HerraBRE

Have you seen my new web application, Collab? Still alpha, so definitely fragile. But designed with simplicity in mind.

sffshortstories.com/collab