• 5 Posts
  • 19 Comments
Joined 1 year ago
Cake day: July 11th, 2023


  • That’s an interesting concept. I bought two weeks ago, when they still had cable modems and a setup I know I could have worked with. I’m politically active, so getting on the board should be an option. However, what’s in the best interest of the vast, vast majority of the owners? The standard service that requires complex gateways, coax run all over your apartment, hardware rental fees, and limits on how many TVs you can have and where, or a system where your smart TV can connect anywhere, your iPhone can always get onto Facebook, and there’s a 24/7 tech support line to change your WiFi password for you? If my preferred network architecture costs each of the 500 owners $1 more per month ($500 total) so that three residents can save $70 per month ($210), I would be failing in my fiduciary duty by charging the masses more so a select few can self-host. We are the minority, and the rest don’t care.



  • The setup is very strange. They don’t provide a router. They took the old phone lines going to each unit (which appear to have been run in Cat5 decades ago) and put RJ-45 ends on them. That plugs into a PoE-powered wireless access point with two more ports on it. With my laptop plugged in, the gateway does not respond to HTTP requests. The tech who installed it said I have to call the home office to change my wireless password. I got them to disable the wireless so I could put my own router on the other end, but now I’m either running on a network that my shady small-time ISP has full control over or I’m behind a double NAT (a quick way to check is sketched below). Speeds were 900+ up and down, though.

    I might see if I can get the AP re-enabled and let the Switch connect to it directly, if that even fixes the Switch’s NAT issues.
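
    For what it’s worth, here’s the rough way I’d check whether I’m actually behind a double NAT. Just a sketch: read the WAN address off the router’s status page and compare it against the public IP as seen from outside. api.ipify.org is only an example lookup service, and the “private” ranges are the usual RFC 1918 blocks plus the CGNAT range.

    ```python
    # Minimal sketch: guess whether the router sits behind another NAT layer.
    # Assumes you can read the WAN address off the router's status page and
    # paste it in; api.ipify.org is just an example public-IP lookup service.
    import ipaddress
    import urllib.request

    PRIVATE_RANGES = [
        ipaddress.ip_network("10.0.0.0/8"),
        ipaddress.ip_network("172.16.0.0/12"),
        ipaddress.ip_network("192.168.0.0/16"),
        ipaddress.ip_network("100.64.0.0/10"),  # CGNAT space
    ]

    def looks_natted(wan_ip: str) -> bool:
        """True if the WAN address is private/CGNAT, i.e. another NAT sits upstream."""
        addr = ipaddress.ip_address(wan_ip)
        return any(addr in net for net in PRIVATE_RANGES)

    public_ip = urllib.request.urlopen("https://api.ipify.org").read().decode()
    wan_ip = input("WAN IP shown on your router's status page: ").strip()

    print("Public IP seen from outside:", public_ip)
    if looks_natted(wan_ip) or wan_ip != public_ip:
        print("WAN address is private or differs from the public IP -> likely double NAT.")
    else:
        print("WAN address matches the public IP -> probably a single NAT layer.")
    ```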






  • What I need is 10-gig storage for my Adobe suite that I can access from my MacBook. I need redundant, fault-tolerant storage for my precious data. I need my self-hosted services to be highly available. What’s the minimum spec to reach that? I started down the U.2 path when I saw enterprise U.2 drives at a similar cost per GB as SATA SSDs, but faster and with crazy endurance. And when my kid wants to run a Minecraft server with mods for him and his friends, I’d better have some spare CPU cycles and RAM to keep up.
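
    For context, this is the back-of-envelope math I’m using for “minimum spec” on the network side, with assumed round-number drive throughputs (your drives will differ):

    ```python
    # Rough numbers for what it takes to keep a 10GbE link busy.
    # Drive throughputs below are assumed typical figures, not measurements.
    LINK_GBPS = 10
    link_bytes_per_sec = LINK_GBPS * 1e9 / 8          # ~1.25 GB/s on the wire

    drives = {
        "SATA SSD (~550 MB/s)": 550e6,
        "U.2 NVMe, PCIe 3.0 x4 (~3 GB/s)": 3e9,
        "U.2 NVMe, PCIe 4.0 x4 (~6 GB/s)": 6e9,
    }

    for name, throughput in drives.items():
        needed = -(-link_bytes_per_sec // throughput)  # ceiling division
        print(f"{name}: need ~{int(needed)} drive(s) streaming flat out to fill 10GbE")
    ```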



  • I’m afraid of dumping 500+ watts into an (air-conditioned) closet. How are you able to saturate the 10 gig? I had some idea that Ceph’s speed is that of the slowest drive, so even SATA SSDs won’t fill the bucket. I imagine that’s because redundancy comes from replicating whole objects rather than parity/striping spreading the data. I’d like to stick to lower-power consumer gear, but Ceph looks CPU-, RAM-, and bandwidth-hungry (both storage and network), and it wants low latency on top of that (rough math below).

    I ran Proxmox/Ceph over 1GbE on e-waste mini PCs and it was… unreliable. Now my NAS is my HA storage, but I’m not thrilled to beat up QLC NAND for hobby VMs.
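
    The rough model in my head for why 1GbE was the wall, assuming a size-3 replicated pool and ~90% usable link bandwidth (illustrative numbers, not a real Ceph benchmark):

    ```python
    # Rough model of why replicated Ceph over 1GbE feels slow.
    # Assumes a size-3 replicated pool and round-number link speeds.
    replicas = 3
    gbe_usable_mb_s = 1000 / 8 * 0.9   # ~112 MB/s usable per direction on 1GbE

    # The primary OSD forwards each write to (replicas - 1) other OSDs, so its
    # outbound link carries (replicas - 1)x the client's write rate.
    max_client_write = gbe_usable_mb_s / (replicas - 1)

    print(f"Usable 1GbE bandwidth: ~{gbe_usable_mb_s:.0f} MB/s per direction")
    print(f"Ceiling on sustained client writes: ~{max_client_write:.0f} MB/s")
    print("A lone SATA SSD (~500 MB/s) is nowhere near the bottleneck here;")
    print("the 1GbE replication traffic is.")
    ```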


  • I looked at EPYC because I wanted the PCIe bandwidth to run U.2 drives at full speed, and it wasn’t until EPYC or Threadripper that you could get much more than 40 lanes in a single socket. I’ve got to find another way to saturate 10 gig and give up on 25 gig. My home automation runs on a Home Assistant Yellow and works perfectly, for what it does.
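
    The lane math that pushed me toward EPYC, with assumed lane counts (check your actual board/CPU; the x8 NIC is just an example):

    ```python
    # Quick PCIe lane-budget sanity check (assumed lane counts; varies by platform).
    # Each U.2 NVMe drive wants a PCIe x4 link; NICs and HBAs eat lanes too.
    platforms = {
        "consumer desktop CPU (lanes usable for slots/NVMe)": 24,
        "Threadripper / single-socket EPYC": 64,   # EPYC goes up to 128
    }

    u2_drives = 6
    lanes_per_u2 = 4
    nic_lanes = 8            # e.g. a dual-port 25GbE card is typically x8

    needed = u2_drives * lanes_per_u2 + nic_lanes
    print(f"Lanes needed for {u2_drives} U.2 drives + one x8 NIC: {needed}")
    for name, lanes in platforms.items():
        verdict = "fits" if lanes >= needed else "does not fit at full speed"
        print(f"  {name}: {lanes} lanes -> {verdict}")
    ```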







  • I had previously run HA on a Raspberry Pi with Z-Wave, Sonoff, and some Hue, years and years ago. After searching for an HA-compatible Zigbee adapter (since that seems to be the current standard and I’m starting over), I saw they offered their own hardware. After 18 months of waiting, I figured it would “just work.” Anyway, I reimaged over USB-C and it’s working. I’ll definitely do a backup before doing my first software update now that I have something to lose.





  • So some news outlets get to protect their precious little articles from the big bad AI, which will probably destroy news as we know it anyway.

    I was thinking about this. What happens when all the big outlets are having AI write their news? You can’t get answers on today’s news without feeding the model today’s news. Therefore, somebody has to create the data source.

    I see a few scenarios:

    • Google scrapes, aggregates, and summarizes to the point that nobody reads the article/sees the ads and the news site goes under. Then Google has nothing to scrape but press releases and government sources. Or…
    • News sites block access to scrapers and charge for it, but may be wary of crossing their customers (news aggregators) in their coverage
    • The above creates a tiered system where premium news outlets (AI-assisted writing but with human insight) are too expensive for ad-supported Google to scrape, so Google gets second-tier news from less reliable, more automated sources, or simply makes it themselves. Why not cut out the middleman?
    • Rogue summarizers will still scrape the real news outlets and summarize stories to sell to Google. This will again make paid news a luxury, since someone with a subscription will summarize and distribute the main point (okay) or their spin (bad).

    I’m failing to see where this will go well. Is there another scenario?