
if i have to read one more fucking post about going back to the good old days of when websites were slow, clunky, barren, boring pages with black text on a white background that auto-refreshed with fucking meta tags, used iframes to divide content, and had zero javascript, i swear to fuck

being able to load a webpage on an eight year old build of opera mini for J2ME is the same thing as good web design. if your website has more than thirty kilobytes of content you are killing the internet. literally nobody has the bandwidth to load that. images are a fucking plague and should never be used

cookies? spyware. javascript? spyware. image etags? spyware. log in requirements? spyware

what's that? you need proof that the internet would be better off if every website looked like craigslist? look at the massive gopher community. look how many people are using gopher:// links every single day. HTTPS is completely eclipsed by the enormous userbase. this is what we want. HTML5 isn't accessible with its "semantic markup" and "aria descriptive captions". that's a bunch of bullshit. you know what IS accessible? raw text. if you can get your screen reader to tell you whether something's a quote or a sentence or a navbar, guess what? it's fucking spyware and NOT accessible

@lynnesbian
I just don't want to have to click 83 buttons in umatrix 😭😭

@lynnesbian OTOH: craigslist is actually a really well-made website. it loads real fast, it uses javascript but *not* for things that javascript shouldn't be used for (like loading static HTML content), it's still fully accessible without javascript, and it doesn't hide all the content in a clunky interface with giant images and hard-to-find links.

@lynnesbian But those old websites also worked properly on devices made more than a couple of years before they were published
Lynnestodon

@lynnesbian@fedi.lynnesbian.space's anti-chud pro-skub instance for funtimes