#Hexbear enjoyer, absentee mastodon landlord, CNC machinist, jack of all trades

Talk to me about astronomy, photography, electronics, ham radio, programming, the means of production, and how we might expropriate them.

He/Him

  • 0 Posts
  • 8 Comments
Joined 4 years ago
Cake day: May 12th, 2020

  • As an everyday Gentoo user, I’d say there is little reason to use Gentoo unless you have specific, niche configuration requirements. For instance, if you need to use a very specific version of a piece of software with very specific build-time parameters.

    Where Gentoo shines is the ability to combine some old packages with some bleeding edge ones. If I, for some reason, want to run PostgreSQL 10 (released 2017) alongside Node.js 20 (released 2023), it is a thing I can do. This is not possible on most other distros - at least, not without side-stepping the package manager and compiling a bunch of things yourself.
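    To give a rough sketch of how that mixing looks under Portage (the atoms and versions here are illustrative - check the actual tree before copying anything):

    ```
    # /etc/portage/package.accept_keywords/nodejs
    # pull a bleeding-edge Node.js from the ~amd64 (testing) branch
    net-libs/nodejs ~amd64

    # /etc/portage/package.mask/postgresql
    # hold PostgreSQL at the old major version by masking anything newer
    >dev-db/postgresql-10.23
    ```

    (In PostgreSQL’s case Portage actually slots the major versions, so old and new can coexist without any masking, but these files are the general-purpose tool for pinning and unpinning things.)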

    I’ve used Gentoo several times over the years, and what ultimately made me switch back was Docker’s reliance on iptables. I was using Fedora at the time, which had switched to nftables. (I don’t think this is as much of an issue now, but it was a few years back).


  • As a machinist, I try to get away with FreeCAD as much as I can. I have access to Solidworks, Creo, MasterCAM, and Esprit at work, as long as the engineers aren’t hogging all of the seats. I prefer modeling in FreeCAD though. It is what I am able to practice at home. (I have a cracked copy of Creo, but the crack only works on Windows).

    I do still have to feed this through Esprit for the CAM portion of my work though. FreeCAD’s CAM workbench is pretty much limited to routing and 3-axis milling at the moment. No turning, and definitely no wire EDM (what I normally do). That said, Esprit is fucking garbage and I have no doubt FreeCAD has the potential to do this better.

    FreeCAD isn’t wonderful at assemblies. I generally work on a component level, and this isn’t an issue for me, but the learning curve only gets steeper if you are trying to design intricate assemblies.

    Nonetheless, I’ve used it to reverse engineer several replacement parts which remain in service, and used it to create the toolpath for one CNC program which is being used in production. I also edit my G-Code in Emacs.

    IMO, the problem isn’t that free software is incapable. The problem is that if you are running an engineering / manufacturing firm, you need to use software which is fully compatible with what your clients use. This is a constantly moving target which even commercial offerings struggle with. If your client designs a skyscraper in AutoCAD, you literally have no choice but to use AutoCAD. It doesn’t matter how good the AutoCAD importer is in SolidWorks. Something in your massive assembly is going to break, or you are going to waste a bunch of mechanical engineering resources trying to solve what is effectively an information technology problem.

    Of course, this doesn’t touch on the CNC controller firmware at all, which, in production, is uniformly proprietary. Predominantly FANUC controls, with some Citizen, Charmilles, Mitsubishi, and Makino sprinkled in. FANUC in particular has grown to be a pain in the ass to maintain, as they’ve been locking down as much shit as possible to combat cloned hardware. In practice, this only makes life more miserable for the shops purchasing genuine hardware to keep their machines running. At least if the Charmilles sinker EDM dies for good, I’ll still be able to play Doom on it until the riggers take it away.


  • I hear about incompatibility problems with hardware

    This only gets better with time. When Windows Vista was released, Linux actually supported more hardware than Windows did, because it never had a comparable break in driver compatibility. Nowadays, unless you are buying bleeding edge hardware which just hit the market within the past month, just about everything works. Typically, once a piece of hardware is supported by Linux, it will remain supported until everybody who knows how it works dies. Linux may suffer with bleeding edge / niche hardware, but it shines above all others in keeping that hardware useful, even when there is no market incentive for the manufacturer to continue support.

    You will run into problems here and there, but the grass isn’t much greener on Windows, where I have also experienced problems with oddball hardware. The only saving grace for Windows is that if you buy a computer that ships with Windows, all the drivers will already be installed. If you download the installation media directly from Microsoft, you end up in the same boat of having most of the hardware working, but having to tie up loose ends yourself.

    So where do I start? I don’t even know how to choose hardware or what to look for.

    I’d look in your closet for some old computer that you stopped using. Try it there first. Nothing to lose. If you don’t have a heap of e-waste lying around, start with something inexpensive to learn the ropes, or try installing it in a virtual machine using something like VirtualBox. In general, just about any computer in the world will run Linux. You might just run into issues with oddball things like fingerprint scanners or weird sensors (e.g. some laptops use accelerometers to stop spinning the hard drive if you drop it).

    I’m unsure what I have to do to stay ‘safe’ on Linux.

    This is easier to do than anywhere else. Linux comes in the form of “distributions.” The distributor hosts a package repository, and you get all (well, 98%) of your software from that repository. This is different from Windows, where it is typical to download individual applications from all corners of the internet. As long as you trust your distributor, you are generally solid as far as safety goes. The only risks come from installing third-party software - but even then - you just apply the same logic as on Windows. Where is this program coming from? Do I trust this person / organization? etc.
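    To make that concrete (assuming a Fedora-style system here - other distros use apt, zypper, pacman, and so on), getting and updating software looks like this:

    ```sh
    # install a program from the distribution's repository
    sudo dnf install gimp

    # update everything on the system in one shot
    sudo dnf upgrade
    ```

    No hunting down installers from random websites, and everything installed this way gets security updates through the same single channel.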

    The default settings are intended to be as safe as practical, and the various manuals and tutorials out there will warn you about doing stupid things. It usually requires manual intervention to make things unsafe.

    Does Linux come with a trustworthy firewall/antivirus/malware detection?

    It is rather uncommon to run antivirus software on Linux. This is typically only done on servers (for instance, a mail server screening attachments before forwarding them along to end users). You can install ClamAV, but this is redundant if you are getting all of your software straight from the distributor. In my humble opinion, antivirus software is a poor approach to security. Once a computer is infected, nothing on it should be trusted, including the antivirus software. Antivirus software is more appropriate as a data recovery tool than a prophylactic.

    There is a firewall built into the kernel, in the form of iptables or nftables, and there are some GUI programs for adjusting them. Again, a firewall isn’t typically necessary unless you are running servers which listen for incoming connections. Typically, having your computer behind a router is sufficient. Unless your router is configured to forward incoming connections to your computer, those packets will be dropped there. Firewalls are more useful as a redundant method of making sure something like a database server, which is also configured only to accept connections from local processes, doesn’t accidentally get misconfigured and accept connections from the open Internet.
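    If you want one anyway, a minimal nftables ruleset along those lines looks something like this (a sketch, not a drop-in config - loosen or tighten it to taste):

    ```
    # /etc/nftables.conf (sketch)
    table inet filter {
        chain input {
            type filter hook input priority 0; policy drop;

            ct state established,related accept  # replies to connections we started
            iif "lo" accept                      # local processes talking to each other

            # nothing else is accepted, so a daemon that accidentally starts
            # listening on 0.0.0.0 still isn't reachable from outside
        }
    }
    ```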

    I hear that ‘open source’ means people can check the code but how do I know if someone has checked the code—I wouldn’t know what to look for myself.

    This is a valid critique. There certainly have been times where this assumption has turned out poorly. Still, it is a better situation than completely unverifiable proprietary software. At the very least, contributors to the individual pieces of software are looking at it, as well as the distributors which need to build and package it. There are a few layers of review taking place, even if they don’t quite reach the level of a full audit.

    TL;DR: If you are just using your computer for casual web browsing and shit, try out Fedora or Ubuntu. The installation media boots to a functioning desktop, and you can try things out and see if they work before committing to installing (this is not true for all distributions though).




  • I’ve used Gentoo for the past five years or so, and off and on for another decade and a half prior. I tried out NixOS for a couple of months and thought it was pretty cool, but in practice maintaining it was not very fun. A lot of things can be set up with one-liners in your Nix file, but other simple things, like making the login prompt show up on the correct display (i.e. my monitor, not the TV), are murder. Setting up complex systems like NextCloud along with Apache and PostgreSQL requires cross-referencing the upstream manuals with the Nix manuals and diving into the actual Nix packages, and you run into a lot of edge-case bugs with little documentation. Also, NixOps is an absolute shit show (to be fair, this is 3rd party, not NixOS’s fault).

    If you can get past the absolutely useless error messages and set up a working configuration, it is very cool. Getting to that point can take several days though. Doing updates without waiting for compilation is very convenient.
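    For a sense of what the happy path looks like, something like a Nextcloud + PostgreSQL setup really can be this terse in configuration.nix (option names are from memory, so treat this as a sketch rather than a working config):

    ```nix
    # configuration.nix (fragment)
    services.postgresql.enable = true;

    services.nextcloud = {
      enable = true;
      hostName = "cloud.example.org";  # hypothetical hostname
      config.dbtype = "pgsql";
    };
    ```

    It’s the moment you step off that path - a custom PHP setting, an unusual database layout, that stubborn login prompt - that you end up reading the module source and chasing those edge cases.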



  • It is a mixed bag. When reading excessively long lines of text, it becomes difficult to locate the next line after completing one. Allowing lines of text to become too long is considered poor typography for this reason. When the lines are constrained to a reasonable length, the text becomes easier to read. Think about a page from a novel, or a sheet of A4 paper. They are shaped like that for a reason. Of course, images and video are another story. Constraining the size of an image or video with such wide margins does nothing to aid visibility.
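    For what it’s worth, the usual way this gets fixed on the web side is just capping the measure in CSS - something like the following (the selector is hypothetical; the ch unit is the point):

    ```css
    /* keep body text to roughly 60-75 characters per line */
    .post-content p {
        max-width: 70ch;  /* 1ch is about the width of the "0" glyph */
    }
    ```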