• 0 Posts
  • 69 Comments
Joined 1 year ago
Cake day: July 8th, 2023

  • Case-sensitive is easier to implement; it’s just a string of bytes. Case-insensitive requires a lot of code to get right, since it has to interpret symbols that make sense to humans. So, something I’ve wondered about:

    That’s not hard for ASCII, but what about Unicode? Is the precomposed ç treated the same lexically and by the API as Latin small letter c + combining cedilla? Does the OS normalize all of one form to the other? Is ß the same as SS? What about alternate glyphs, like half width or full width forms? Is it i18n-sensitive, so that, say, E and É are treated the same in French localization? Are Katakana and Hiragana characters equivalent?

    I dunno, as a long-time Unix and Linux user, I haven’t tried these things, but it seems odd to me to build a set of character equivalences into the filesystem code, unless you’re going to do all of them. (But then, they’re idiosyncratic and may conflict between languages, like how ö is its own letter in the Swedish alphabet.)
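
    For a concrete taste of the problem, here’s a minimal Python sketch (standard library only, using unicodedata) of a few of those equivalences; a case-insensitive filesystem has to pick an answer for each one:

        # A few of the equivalence questions above, asked of Python's stdlib.
        import unicodedata

        precomposed = "\u00e7"   # ç (Latin small letter c with cedilla)
        decomposed = "c\u0327"   # c + combining cedilla

        print(precomposed == decomposed)                                # False: different code point sequences
        print(unicodedata.normalize("NFC", decomposed) == precomposed)  # True: equal after normalization

        print("ß".casefold() == "ss")                                   # True: full case folding maps ß to ss
        print("ß".lower() == "ss")                                      # False: simple lowercasing does not

        print(unicodedata.normalize("NFKC", "\uff21") == "A")           # True: fullwidth Ａ matches A, but only under compatibility normalization

        print("É".casefold() == "E".casefold())                         # False: accents survive case folding
        print("ア".casefold() == "あ".casefold())                        # False: Katakana and Hiragana stay distinct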



  • I had a 13" black-and-white television in my bedroom when I was a teen. The big color Trinitron TV that we got later was amazing. Beyond that, I don’t recall the improvement in quality making sitcoms funnier, or the stories better.

    In fact, to me, the old, fuzzy NTSC video is better in some ways. It helps with the suspension of disbelief, the feeling of watching a story on the screen. Even 1080p is sometimes too good, to the point that the actors fall into the Uncanny Valley, like I’m watching a live play, but not quite. Instead of a story, I see the makeup on skin, the wardrobe choices, the blocking, and the bad CGI backgrounds.

    I can certainly hear the quality differences in audio, but I feel like past a certain minimum, I’m listening to the music, not the equipment. Like, my Shokz had a noticeable lack of bass when I got them, but I’ve adapted, and don’t hear them that way any longer. The convenience of open-ear headphones far exceeds any gain in quality.



  • Just spitballing here, but if I read this correctly, you pulled the Windows drive, installed Mint, and then put the Windows drive back in alongside the Mint drive? If so, that might be the issue.

    UEFI firmware looks for a special EFI partition on the boot drive, and loads the operating system’s own bootloader from there. The Windows drive has one. When you pulled the Windows drive to install Mint on another drive, Mint had to create an EFI partition on its disk to store its bootloader.

    Then, when you put the Windows disk back in, there were two EFI partitions. Perhaps the UEFI firmware was looking for the Windows bootloader in the EFI partition on the Mint disk. It would of course not find it there. In my experience, Windows recovery is utterly useless in fixing EFI boot issues.

    It’s possible to rebuild the Windows EFI bootloader files manually, but since you don’t mind blowing away both OS installs, I’d say just install Mint on the second drive while both drives are in the system, so the installer puts the Mint bootloader on the same EFI partition as the Windows one. Even with EFI, Windows will still sometimes blow away a Linux bootloader, but Linux installers are very good at installing alongside Windows. If it does get stuffed up, there’s a utility called Boot-Repair that you can put on a USB stick, and it works a lot better than Windows recovery.
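
    If you want to confirm the two-ESP theory before reinstalling, here’s a small diagnostic sketch in Python (the function and constant names are just illustrative; it assumes a Linux live session with util-linux’s lsblk available, which Mint has). It lists every partition whose GPT type GUID marks it as an EFI System Partition; seeing one on each drive would match the situation above:

        # List every EFI System Partition the firmware could be reading a bootloader from.
        import json
        import subprocess

        ESP_GUID = "c12a7328-f81f-11d2-ba4b-00a0c93ec93b"  # standard GPT type GUID for an EFI System Partition

        def find_esps():
            out = subprocess.run(
                ["lsblk", "-J", "-o", "NAME,PARTTYPE,SIZE,MOUNTPOINT"],
                capture_output=True, text=True, check=True,
            ).stdout
            esps = []

            def walk(nodes):
                for node in nodes:
                    if (node.get("parttype") or "").lower() == ESP_GUID:
                        esps.append(node)
                    walk(node.get("children", []))

            walk(json.loads(out)["blockdevices"])
            return esps

        if __name__ == "__main__":
            for part in find_esps():
                print(part["name"], part["size"], part.get("mountpoint") or "(not mounted)")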