Oh no, you!

  • 1 Post
  • 367 Comments
Joined 1 year ago
Cake day: November 3rd, 2024



  • Not sure if it’s still in use today, but the above description covers 2008 through 2012. MS-DOS was used for “gun timing”, which basically amounted to opening or shutting some solenoid valves with extreme precision. The computer had a GPS input, a bunch of serial outputs, and a control line (also serial).

    The control line sent instructions for which solenoid to open and when, the time reference was derived from the GPS, and you can probably guess what the serial outputs were for.
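    The scheme described above can be sketched roughly like this (the command format, solenoid IDs, and function names are my own inventions for illustration, not the original system's):

    ```python
    # Hypothetical sketch: fire solenoids at precise times against a shared
    # GPS time base. In the real system the commands went out over serial;
    # here we only model the timing and the command encoding.

    def schedule_fire(gps_now: float, fire_at: float) -> float:
        """Return how long to wait (seconds) before toggling the valve."""
        delay = fire_at - gps_now
        if delay < 0:
            raise ValueError("fire time has already passed")
        return delay

    def build_command(solenoid_id: int, open_valve: bool) -> bytes:
        """Encode a one-line command for a serial output (made-up format)."""
        state = "OPEN" if open_valve else "SHUT"
        return f"SOL{solenoid_id:02d} {state}\n".encode("ascii")

    # Example: GPS reference says t = 100.0 s; fire solenoid 3 at t = 100.25 s.
    delay = schedule_fire(100.0, 100.25)
    cmd = build_command(3, True)
    ```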



  • I had a similar matrix of screens at my old job: Seismic survey observer desk. Three rows tall, four columns wide.

    They weren’t all connected to the same PC, though. If I remember correctly:

    • Top row was one PC (Linux)
    • Middle row was another (Linux)
    • Bottom left was its own PC (it ran MS-DOS 6.22. Excellent OS for realtime stuff!)
    • Bottom right was its own (Linux)
    • Middle two at the bottom were Windows.

    They were all connected to a Raritan KVM switch, which I used to select which row to control. The exceptions were bottom left and bottom right, which each had a dedicated keyboard and mouse.

    I have a picture of it somewhere, but I can’t seem to find it.








  • I’ve done some very dodgy things with VGA cables in an effort to route them through narrow bulkheads. For normal computer-to-monitor lengths this is probably fine.

    I haven’t noticed much signal degradation below 4m-ish.

    At 12m, you’d better solder properly and wrap some extra shielding around your splice.

    Source: I’ve run plenty of VGA cables between bridge computers and a deck monitor on ships.




  • I’d say that a good starting point would be the smallest setup that serves a useful purpose. This is usually some sort of network storage, and it sounds like this might be a good starting point for you as well. You can then add on and refine your setup however you see fit, provided your hardware is up to it.

    Speaking of hardware, while it’s certainly possible to go all out with a purpose-built rack-mounted 19" 4U server full of disks, the truth is that “any” machine will do. Servers generally don’t require much (depending on use case, of course), and you can get away with a second-hand regular desktop machine. The only caveat here is that for your (perceived) use cases, you might want the ability to add a bunch of disks. So for now, just go for a simple setup with as many disks as you see fit, and then expand with a JBOD cabinet later.

    Tying this storage together depends on your tastes, but it generally comes down to two schools of thought, both of which are valid:

    • Hardware RAID. I think I’m one of the few fans of this, as it does offer some advantages over software RAID. I suspect that the ones who are against hardware RAID and call it unreliable have not been using proper RAID controllers. Proper RAID controllers with write cache are expensive, though.
    • Software RAID. As above, except it’s done via software instead (duh), hence the name. There are many ways to approach this, but personally I like ZFS - Set up multiple disks as a storage pool, and add more drives as needed. This works really well with JBOD cabinets. The downside to ZFS is that it can be quite hungry when it comes to RAM. Either way, keep in mind that RAID, software or hardware, is not a backup.

    Source: Hardware RAID at work, software RAID at home.
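    The ZFS pool approach above might look something like this. These are standard `zpool`/`zfs` commands, but the pool name, device names, and mirror layout are placeholders; pick a layout that matches your disks and redundancy needs:

    ```shell
    # Create a pool named "tank" from two mirrored pairs (placeholder devices).
    zpool create tank mirror /dev/sda /dev/sdb mirror /dev/sdc /dev/sdd

    # Later, when the JBOD cabinet arrives, grow the pool with another mirror.
    zpool add tank mirror /dev/sde /dev/sdf

    # Carve out a dataset for archived data.
    zfs create tank/archive

    # Check pool health and capacity.
    zpool status tank
    zpool list
    ```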

    Now that we’ve got storage addressed, let’s look at specific services. The most basic use case is something like an NFS/SMB share that you can mount remotely. This allows you to archive a lot of the stuff you don’t need live. Just keep in mind, an archive is not a backup!
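    An NFS share like the one above could be wired up roughly as follows. The hostname, network range, and paths are placeholders, and the export options are just one sensible choice:

    ```shell
    # On the NAS: export the archive directory to the local network.
    # /etc/exports:
    #   /tank/archive  192.168.1.0/24(rw,no_subtree_check)
    sudo exportfs -ra

    # On the client: mount it once by hand...
    sudo mount -t nfs nas:/tank/archive /mnt/archive

    # ...or persistently via /etc/fstab:
    #   nas:/tank/archive  /mnt/archive  nfs  defaults,_netdev  0  0
    ```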

    And just to be clear: an archive is mainly a means of offloading chunks of data you don’t need accessible 100% of the time, for example older or completed projects. An archive is well suited to a large NAS, as you’ll still have access to it if needed, but it’s not something you need to spend disk space on with your daily driver. But an archive is not a backup; I cannot state this enough!

    So, backups… well, this depends on how valuable your data is. A rule of thumb, in a perfect world, involves three copies: one online, one offline, and one offsite. This should keep your data safe in any reasonable contingency scenario. Which of these you implement, and how, is entirely up to you; it all comes down to a cost/benefit equation. Sometimes following the rule of thumb is simply not viable, such as when you have data in the petabytes. Ask me how I know.

    But, to circle back to your immediate need, it sounds like you can start with something simple. Your storage requirement is pretty small, and adding some sort of hosting on top of that is pretty trivial. So I’d say that, as a starting point, any PC will do; just add a couple of hard drives to make sure you have enough for the foreseeable future.


  • Back in the day I used Nagios to get an overview of large systems, and it made it very obvious if something wasn’t working, and where. But that was 20 years ago; I’m sure there are more modern approaches.

    Come to think of it, we have Grafana running at work, but I’m not sure exactly what scope it’s operating under.