It’s not Jellyfin, but here’s my N100 simultaneously doing two 4K HDR transcodes with tone mapping enabled. Neither stream had buffering.
So it’s definitely a capable chip, but might be dependent on transcode settings.
Indeed. Sounds like in your case the i5-6500 you have is already suiting your needs, so really no need for more expense. For someone who doesn’t have something like that already and needs to make a purchase, though, I’ve come around to generally recommending something like the N100 over a used older-generation processor: the prices are very similar, but I feel you get a bit more with the more recent chips thanks to the modern HW encode/decode and low power use.
The N100 mini PCs are a fantastic choice for hosting media server software, primarily because of their transcoding capabilities.
The i5-6500 you have and the N100 perform very similarly with general compute tasks (though the TDP of the N100 is 6W vs 65W for the same performance). However, the N100 comes with the full Alder Lake Quick Sync engine compared to the Skylake engine on the i5-6500. If you review the hardware encode/decode table here, you can see Skylake HW encode/decode caps out at 8-bit HEVC (HDR 4K content is typically 10- or 12-bit HEVC), whereas the N100 supports even very recent codecs like 10-bit AV1. I recently set up Plex on an N100 mini PC I got for $150 (with 8GB RAM and a 256GB NVMe drive included), and it was able to simultaneously do 2x 4K HDR transcodes with tone mapping while also doing a full library scan and credits detection. Of course, if you’re picky about what clients are watching your content to ensure they always watch original quality, you may not need to transcode.
That said, the N100 mini PC I purchased only has slots for one NVMe drive and one 2.5" SATA drive. In my case this was perfect because all my media is on a NAS which the N100 now accesses via an NFS mount, and I can easily back up the minimal persistent data on the N100 itself.
But it sounds like it wouldn’t 100% satisfy everything OP is looking for on its own. If they still wanted an N100 for the transcode capabilities, they may be able to use a USB HDD enclosure to add additional storage without needing a separate system, but because I already had a NAS for my dedicated storage, it isn’t something I looked into in detail.
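If anyone wants to sanity-check the hardware transcode path on an N100 outside of Plex, here’s a rough Python sketch of the kind of test I mean. It just shells out to ffmpeg’s Quick Sync (QSV) decoder/encoder and times it; the filename is a placeholder, it assumes an ffmpeg build with QSV support, and it skips tone mapping (Plex handles that part itself):

```python
import subprocess
import time

SRC = "uhd_hdr_sample.mkv"   # placeholder: any 4K 10-bit HEVC file

cmd = [
    "ffmpeg", "-y",
    "-hwaccel", "qsv",        # decode on the iGPU via Quick Sync
    "-c:v", "hevc_qsv",
    "-i", SRC,
    "-c:v", "h264_qsv",       # encode on the iGPU as well
    "-b:v", "8M",
    "-c:a", "copy",
    "qsv_test_out.mkv",
]

start = time.time()
subprocess.run(cmd, check=True)
print(f"Finished in {time.time() - start:.1f}s")
```

If the elapsed time comes in well under the clip’s runtime, the iGPU is doing the work and you’ve got headroom for multiple streams.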
Oh I know, but my thermostat and a handful of other devices are Zwave, so for me specifically it’s probably not worth changing things up at this time.
I was weighing the same options recently and went with an N100 mini PC with 8GB RAM and a 256GB M.2 SSD for $150. Absolutely no regrets.
I noticed you didn’t list storage with your RPi5. Are you just using eMMC? I’d strongly recommend against eMMC as your only storage if you’re doing anything write-intensive, since the lifespan of eMMC is generally much shorter than that of even cheap SSDs (and performance is much lower compared to M.2 over PCIe), and it’s not something you can just swap out if it dies. On my existing Pis and other SBCs, I use any eMMC only for the bootloader and/or core OS image (if at all) and store everything else either on physically attached SD cards, SSDs, or mounted network volumes.
This additional storage adds even more cost to the Pi, even if you go with my recommended minimum of an SD card (short lifespan, but at least you can replace it). So now the 8GB Pi is $80 + $10-15 for a case with fan and heatsinks + $10-15 for a power supply + $15+ for an SD card or other storage = $115-125+ total.
In comparison, the $150 N100 mini PC comes with case, power supply, and storage. Both the included 256GB M.2 SSD and 8GB RAM are easily replaced or upgraded using standard SSDs and laptop memory (up to 16GB DDR4-3200). The Intel N100 scores more than twice as high in Passmark compared to the ARM Cortex-A76, and includes a full Alder Lake QuickSync engine (meaning it can hardware encode/decode a large variety of video codecs with the integrated GPU, including very new and demanding ones like 10-bit AV1). I stress tested it recently and it was capable of simultaneously transcoding 2x 4K HDR movies (both full UHD Blu-ray quality, one of them 60fps at a 100Mbps bitrate) with tone mapping in Plex in real time while also doing a full library scan and credits detection. In addition, x86 architecture is still more broadly supported than ARM, so compatibility is less of an issue. (That said, in this particular case, the N100 is only fully supported in newer Linux kernels. I upgraded Ubuntu 22.04.4 to the 6.5 kernel and installed a few other driver packages to get it fully working, which wasn’t hard, but it’s an additional step.)
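A quick way to confirm the codec support is actually exposed after the kernel/driver work is to check which VA-API profiles the driver reports. Rough sketch below (assumes vainfo from libva-utils and the Intel media driver are installed; the profile names are the standard VA-API ones):

```python
import subprocess

# Standard VA-API profile names for 10-bit HEVC and AV1
WANTED = ["VAProfileHEVCMain10", "VAProfileAV1Profile0"]

res = subprocess.run(["vainfo"], capture_output=True, text=True)
report = res.stdout + res.stderr   # vainfo splits output across both streams

for profile in WANTED:
    print(f"{profile}: {'reported' if profile in report else 'missing'}")
```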
For me, in the end the price difference was at most $25 and the advantages made it clearly worth it.
That said, if all I wanted was a much lower powered SBC just to run a handful of light services, I might look at one of the cheaper Pis or similar and just accept that it’ll eventually die when the eMMC dies (and back up any persistent data I’d want to retain).
That does also look like a good option. In my case, I have a Pi 4 running both zigbee2mqtt and zwave-js-ui using connected Zigbee and Zwave USB dongles placed centrally in the house (Eclipse mosquitto is running on a separate 3-server cluster). I’ve only briefly searched, but network zwave controllers seem to be much less common or more expensive, so I probably wouldn’t benefit much from changing just my Zigbee controller at the moment (the Pi would still need to stay centrally placed for Zwave anyway).
I’ll check it out. I suspect configuration would likely be a little bit more complicated in my case because I’m using Authentik for proxy forward authentication and had also been using access control groups in NPM (both a LAN group and a WAN group containing Cloudflare proxy IP addresses, since currently all my publicly accessible domains proxy through Cloudflare).
This 100%. A FLAC CD rip is maybe 400MB. That’s 2,500 albums per terabyte, and I just recently got an 18TB drive for my NAS for $180. That’s $0.004 per album storage cost. I’d rather have a lossless permanent copy of any of my CDs than save fractions of a penny per album.
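For anyone who wants to plug in their own drive prices, the back-of-the-envelope math is trivial (the numbers below are just my rough figures from above):

```python
album_mb  = 400       # rough size of a FLAC CD rip
drive_tb  = 18
drive_usd = 180

albums_per_tb = 1_000_000 / album_mb              # ~2,500 albums per TB
usd_per_album = (drive_usd / drive_tb) / albums_per_tb

print(f"{albums_per_tb:.0f} albums/TB, ${usd_per_album:.4f} per album")
# -> 2500 albums/TB, $0.0040 per album
```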
Nginx is a lot less painful if you use Nginx Proxy Manager. You get a nice GUI and can easily get SSL certificates with Let’s Encrypt, including wildcard certs. I’m running it in front of a docker swarm and 3 other servers, and in most cases, it takes me about 30 seconds to add a new proxy host and set it up with https using my *.domain.com wildcard cert. I also use it with Authentik as a forward proxy auth for SSO (since many containers out there don’t have the best security).
To be fair, the add-ons are just containers installed and managed by HA. In most cases, you can install all of them as separate containers via something like Docker, but configuration takes more steps (though you also get more control).
Example: I have HA, Eclipse mosquitto, zigbee2mqtt, zwave-js-ui, node-red, Grafana, and influxdb all running as docker containers on two different devices (my main HA host wasn’t ideal for Zigbee and zwave USB dongles, so those are on a Pi 4). The other containers are accessible separately or from within HA as iFrame panels.
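Since everything talks over MQTT, the containers don’t need to care which host the others live on. As a rough illustration (Python with paho-mqtt 2.x; the broker address is just a placeholder), any box on the network can watch what zigbee2mqtt publishes:

```python
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    # zigbee2mqtt publishes device state under zigbee2mqtt/<friendly_name>
    print(msg.topic, msg.payload.decode())

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt >= 2.0
client.on_message = on_message
client.connect("192.168.1.50", 1883)   # placeholder mosquitto address
client.subscribe("zigbee2mqtt/#")
client.loop_forever()
```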
Technically 3.5" SSDs are still out there, but they’re massive (16-64 TB) and target enterprise use (with a price to match).
And 3.5" is still the standard for platter HDDs, which are still the more economical option if you need large amounts of storage.
Now if you meant no more 3.5" floppy disk drives, then yes, those are definitely gone. ;)
I think you may have misread OP’s post. They haven’t built a PC since shortly after they were 10-11, which was almost 30 years ago. So developments since the turn of the century are in fact relevant here, heh.
This is the way. I’ve had absolutely zero issues with my Hue bulbs directly connected to a USB Zigbee controller and running zigbee2mqtt. With Zigbee bindings to smart switches, they respond practically instantly as well whenever we decide to control them that way.
The only way I see a company like this having “significant economic harm” from you not using their free app is if 1) they eventually plan to charge a fee to use the app or 2) they profit from data their app collects about you (third party data sales, for example).
Not something I’m interested in either way, so they’ve lost a potential customer.
Yup. I seem to remember most mainstream albums were around $15-20 in the 1990s. Adjusted for inflation, that’d be about $28-37 today.
I don’t actually own a Hue bridge and have never used one in my setup, but I have about a dozen Hue bulbs (plus additional non-Hue bulbs where “budget” options would suffice). I have HA running in Docker on my NAS and Z2M running in Docker on a Pi 4 (which is also running my Z-Wave container) placed in a more central location in my house, with a Sonoff Zigbee dongle attached. They communicate with each other via gigabit Ethernet. Altogether I have about 50 Zigbee devices on my network.
It did take a bit to get everything set up and communicating with each other, and I specifically chose Zigbee channels that don’t overlap with my WiFi (since they’re both 2.4GHz). But my light response is essentially instantaneous via my HA app or a bound smart switch, so it’s definitely doable without a bridge using existing tech.
Tap water in the US costs on average about $0.01 per gallon or less. People typically drink a gallon or less per day, so about $0.30 per month. Your water bill is pretty much not affected by the tap water you drink, just the water you use for everything else.
Bottled water is easily hundreds of times the cost.
Just consult this handy chart to determine which type of catgirl you’re facing.
Arguably, if you use 2FA to access your passwords in 1Password, there’s little difference between storing all your other OTPs in 1Password or in a separate OTP app. In both cases, since both your passwords and your OTP secrets are on the same device (your phone), you lack a true second factor. The most likely way someone would gain access to a 1Password vault secured with 2FA is if they control your device and it’s been compromised, and having your OTPs separated wouldn’t provide additional protection there. Thankfully, the larger benefit of OTPs for most people is that they are one-time-use, not that they originate from a second factor.
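To make that concrete: a TOTP code is computed entirely from the shared secret plus the current time (RFC 6238), so whoever holds the secret can mint valid codes, no matter which app it’s stored in. A rough stdlib-only Python sketch (the secret here is just the usual documentation example, not a real one):

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    # RFC 6238: HMAC the current 30-second counter with the shared secret,
    # then dynamically truncate to a 6-digit code.
    key = base64.b32decode(secret_b32.upper())
    counter = struct.pack(">Q", int(time.time()) // period)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # example secret from the TOTP docs
```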
There is one theoretical situation I can think of where having your OTPs and passwords separate could be an advantage, and that’s if someone gained all your 1Password login details, including the 2FA secret key. But for someone able to gather that much sensitive intel, I’m not sure how much more of a challenge an authenticator app would be.
If you truly feel you need a second factor though, you’ll probably want to look at something like a YubiKey or Titan key. I’ve considered getting one to secure my 1Password vault to reduce the risk of a lost phone compromising it.
Makes sense. Pathfinder already shifted over to Ancestries in their 2nd Edition. Paizo has a pretty good history of representation and sensitivity to stuff like this though.