• 0 Posts
  • 9 Comments
Joined 1 year ago
Cake day: June 10th, 2023


  • It’s hard to explain. A lot of it is about vibes and focus over the last several years.

    1. There’s a popular suspicion that, rather than fixing issues, Dems allowed them to persist so they could campaign on them during an election year.
    2. Dems’ platform in 2016 was: Hillary’s more competent. In 2020: Trump’s a menace. In 2024: Trump’s a menace. Meanwhile, people cared more about putting food on the table, not dying of the plague, and war crimes. Sure, welfare was part of the Dems’ plans and platform, but it wasn’t the core message.
    3. Related to #2, people felt unheard, ignored, and taken for granted. We’ve been losing faith in a 2-party system where neither side has to be good; they just have to threaten that the other side is worse. Well, when people feel they have nothing to lose, they put a bull in the china shop and hope they wind up on top when the dust settles.

    Bernie’s being a bit harsh in saying Dems didn’t try. Republicans blocked their efforts. But there’s also a feeling that they didn’t care all that much. At the end of the day, they’re career politicians, padding their pockets with corporate donations while demanding starving citizens vote for them because the other guy would be somewhat less palatable. And I guess Trump’s honesty about being apathetic and money-grubbing is more appealing than Dems’ feigned innocence and solidarity.


  • I tend to agree, but there are two issues working against Star Trek.

    1. Successful media appeals to broad audiences by having something to appeal to every demographic. (E.g. Don’t like politics? Stay for the lasers.)
    2. Good sci-fi (arguably stories in general) gives the best representation of both sides of a conflict, and lets them compete on their merits. So it’s possible to resonate with one side, then miss the critique (e.g. due to modest writing or selective hearing).

    So while Star Trek tends to show progressive values winning in the end, many people can enjoy other aspects (e.g. military stories, relationships, and action) while ignoring the upshot.


  • For LLMs, I’ve had really good results running Llama 3 in the Open WebUI docker container on an Nvidia Titan X (12GB VRAM).
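
    Open WebUI typically sits in front of an Ollama backend; assuming that backend’s default port 11434 is published and a model has already been pulled under the tag `llama3` (both assumptions about this particular setup), a quick sanity check from Python might look roughly like this:

    ```python
    # Minimal sketch: query a local Llama 3 model through Ollama's HTTP API.
    # Assumes an Ollama backend (behind Open WebUI) exposing port 11434 on
    # localhost and a model already pulled under the tag "llama3".
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3",  # assumed model tag
            "prompt": "In one sentence, why does VRAM matter for local LLMs?",
            "stream": False,    # return a single JSON object instead of a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])
    ```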

    For image generation tho, I agree more VRAM is better, but the algorithms still struggle with large image dimensions, so you wind up needing to start small and iteratively upscale, which afaik works ok on weaker GPUs but can still run into problems. (I’ve been using the AUTOMATIC1111 mode of the Stable Diffusion Web UI docker project.)
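
    As a rough sketch of the start-small-then-upscale idea: the AUTOMATIC1111 UI exposes an HTTP API when launched with the --api flag, and you can drive both steps from Python. The host/port, prompt, resolution, and upscaler name below are placeholders, not anything specific to my setup:

    ```python
    # Rough sketch: generate at a modest resolution, then upscale the result.
    # Assumes the AUTOMATIC1111 webui is running with --api on localhost:7860;
    # prompt, sizes, and upscaler choice are placeholders.
    import base64
    import requests

    BASE = "http://localhost:7860"

    # 1. Generate at a size the GPU handles comfortably.
    txt2img = requests.post(
        f"{BASE}/sdapi/v1/txt2img",
        json={"prompt": "a lighthouse at dusk, oil painting",
              "width": 512, "height": 512, "steps": 25},
        timeout=600,
    ).json()
    small_b64 = txt2img["images"][0]

    # 2. Upscale the result with one of the bundled upscalers.
    upscaled = requests.post(
        f"{BASE}/sdapi/v1/extra-single-image",
        json={"image": small_b64, "upscaling_resize": 2,
              "upscaler_1": "R-ESRGAN 4x+"},  # assumed upscaler name
        timeout=600,
    ).json()

    with open("upscaled.png", "wb") as f:
        f.write(base64.b64decode(upscaled["image"]))
    ```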

    I’m on thumbs so I don’t have the links to the git repos atm, but you basically clone them and run the docker compose files. The readmes are pretty good!