• 0 Posts
  • 13 Comments
Joined 6 months ago
Cake day: June 8th, 2025


  • Zeroth, consider GrapheneOS on that Pixel.

    First, Syncthing on the PC and Syncthing-Fork on the phone. Now you can sync your photo files (and anything else) from phone to PC and vice versa. Congrats, you have a photo backup.

    Second, either a VPN to your home network so you can back up on the road, or Immich (as suggested elsewhere) for your own Google Photos experience.

    Third, whichever of the second you didn’t choose.

    Fourth, get ye an offsite backup (search “3-2-1 backup”). rclone is your friend, but encrypt locally first with Cryptomator, then you don’t have to trust your storage provider.
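    A minimal sketch of that fourth step, assuming your Cryptomator vault (the encrypted files, not the unlocked view) lives at ~/Vaults/photos and you’ve already set up an rclone remote named offsite — both names are made up for illustration:

    ```shell
    # One-time: configure the storage provider interactively.
    rclone config

    # Push the *encrypted* vault directory to the remote.
    # ~/Vaults/photos and "offsite" are hypothetical names.
    rclone sync ~/Vaults/photos offsite:photos-backup --progress

    # Sanity check: compare local vault against the remote copy.
    rclone check ~/Vaults/photos offsite:photos-backup
    ```

    Because Cryptomator encrypts before anything leaves the machine, the provider only ever sees ciphertext. Stick the sync in a cron job and that’s your offsite leg of 3-2-1.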


  • She’s all switched to Linux, and if her trial goes well and I don’t end up tearing my hair out doing tech support, I may switch over as well, probably to a different distro though.
    That’s an interesting approach; I usually experiment on stuff myself before making others switch. It makes me more comfortable with the setup and more confident that I’ll be able to solve their issues.

    Yeah, that was the standout for me too. You’re on winblows and might switch later, but you’re going to support her. Way to treat the gf as a lab rat…




  • ROCm works just fine on consumer cards for inference, is competitive or superior in $/token/s, and beats NVIDIA on power consumption. ROCm 7.0 seems to be giving a >2x uplift on consumer cards over 6.x, so that’s lovely. Haven’t tried 7 myself yet (waiting for the dust to settle), but I have no issues with image gen, text gen, image tagging, video scanning, etc. using containers and distroboxes on Bazzite with a 7800 XT.

    Bleeding edge and research tend to be CUDA, but mainstream use cases are getting ported reasonably quickly. TL;DR: unless you’re training or researching (unlikely on consumer cards), AMD is fine and performant, plus you get stable Linux and great gaming.
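    For the container route, a sketch of how that looks with Ollama’s published ROCm image (device and volume flags follow the upstream docs; Podman is assumed, Docker works the same, and the model name is just an example):

    ```shell
    # Run Ollama's ROCm build, passing through the AMD GPU device nodes
    # (/dev/kfd for compute, /dev/dri for the card itself).
    podman run -d \
      --device /dev/kfd --device /dev/dri \
      -v ollama:/root/.ollama \
      -p 11434:11434 \
      --name ollama \
      docker.io/ollama/ollama:rocm

    # Pull and chat with a model inside the container.
    podman exec -it ollama ollama run llama3.1:8b
    ```

    Nothing touches the host install, which is the whole point on an image-based distro like Bazzite.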







  • It’s PCIe 4.0 :(

    Boo! Silly me for thinking DDR5 implied PCIe 5.0, what a shame.

    Feels like they’re testing the waters with Halo; hopefully a loud “water’s great, dive in” signal gets through and we get something a bit fitter for desktop use, maybe with more memory (and bandwidth) next gen. Still, gotta love the power usage: makes for one hell of a NAS / AI inference server (and inference isn’t that fussy about PCIe bandwidth; hell, eGPU works fine as long as the model / expert fits in VRAM).