• 0 Posts
  • 35 Comments
Joined 1 year ago
Cake day: June 10th, 2023

  • If you want to start with the most effective upgrade, move your router or primary switch to 2.5G or 10G. Then at least there's a low likelihood of a bottleneck when your devices are communicating internally with each other, and you'll have overhead downstream. Then, if you have multiple switches, prioritize the highest bandwidth between them over upgrading your devices beyond 1Gb NICs.

    I use an OPNsense router with 2.5G NICs, and then I have a 2.5G switch and a 1Gb switch that are connected via a 10Gb fiber link. (This is all Ubiquiti enterprise-level stuff.) But all my downstream devices and switches are 1Gb, and I have no plans to upgrade them intentionally. Internally, I won't see bottlenecks often, since the links between the switches and router are enough to support multiple devices spamming 1Gb/s file transfers simultaneously (not that it'll happen often lol).
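    Back-of-the-envelope, that works out: a 10Gb uplink can carry roughly ten simultaneous 1Gb/s streams before it saturates. Here's a minimal Python sketch of that arithmetic (the numbers are illustrative, not measurements from my network):

    ```python
    # Rough check: does an inter-switch uplink bottleneck before the
    # per-device NICs do? All rates in Gbit/s; numbers are illustrative.

    def uplink_saturated(uplink_gbps: float, nic_gbps: float, active_devices: int) -> bool:
        """True if every device maxing its NIC would exceed the uplink."""
        worst_case_demand = nic_gbps * active_devices
        return worst_case_demand > uplink_gbps

    # 10Gb fiber link between switches, downstream devices on 1Gb NICs:
    print(uplink_saturated(10, 1, 8))   # False: 8 Gbit/s fits in 10Gb
    print(uplink_saturated(10, 1, 12))  # True: 12 Gbit/s exceeds the link
    ```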

    So my WiFi access points, primary NAS, and my most-used PC are all on 2.5Gb connections, since they could benefit. But everything else is on 1Gb, since that switch has way more ports and was way cheaper.

    I'm not against buying 10G switches for future-proofing, but they're still too costly for my needs, and it's unlikely I'll wish I had 10G any time soon, especially when it comes to internet. Even if I upgrade beyond 1Gb fiber service, it'd be so that multiple devices can fully saturate a 1Gb NIC at the same time, not so one computer can speed test at 3Gb+.

    That said, what I have is overkill, but I enjoy some homelab tinkering.


  • Most likely fiber. Around here the ADSL provider (CenturyLink) was the first to start deploying fiber, to compete with cable that could do 1Gb (which is, of course, highly variable and full of asterisks because of coax quality, how many neighbors' modems share the segment, possible MoCA interference, etc.).

    More recently they rebranded fiber as a different company… Probably to get rid of the DSL name stigma.



  • I mean, the issues were present and widely reported for several months before Intel even acknowledged the problems. And it wasn't just media reporting this; it was also game server hosts who were seeing massive deployments failing at unprecedented rates. Even those customers, who get way better support than the average home user, were largely dismissed by Intel for a long time. It then took several more months to ship a fix. The widespread nature of the issues points to a major failure on the company's part to properly QA and to ensure their partners were given accurate guidance for motherboard specs. Even so, the patches only prevent further harm to the processor; they don't fix any damage already incurred, which could amount to years off a chip's lifespan. Sure, they are doing an extended warranty, but that's still a band-aid.

    I agree it doesn't mean one should completely dismiss the possibility of buying an Intel chip, but it certainly doesn't inspire confidence.

    Even if this was all an oversight or process failure, it still looks a lot like Intel as a whole deciding to ship chips with a nice-looking set of numbers, despite those numbers being achieved at the cost of a degraded lifespan.





  • As a side note, if you work somewhere that uses 1Password, you can usually get your personal subscription comped as an individual. You only need to pay for it if you leave your company or they drop 1Password.

    I don't know that I'll stay on 1Password forever, but on the scale of things I'm most concerned about self-hosting vs. using a reasonably private SaaS, 1Password is nowhere near the top of my list to ditch. Otherwise, it's a solid recommendation for non-self-hosters who want to make some progress.


  • The space example is extremely apt. It's possible we could have had tons of space stations, a moon colony, maybe even some other stuff going on around the solar system, asteroid mining, etc. But that would have at least required the space race to continue longer and for spending to grow to create a big enough industry to ensure that outcome, assuming no capacity or time issues. Alas, we took another path.

    Something that seems important to us might not matter in even 10 years, or at least, not have a monetary and/or societal incentive to keep advancing.




  • I've also had struggles with printing on Arch, more so than on Debian-based distros. EndeavourOS is where I did the most troubleshooting, but it's also a problem on my Manjaro install (which I'll move to Endeavour… someday). But learning how to use CUPS directly was worth it.

    Currently, printing via GUI is like 5 ppm and very low DPI, so… not great. But at least I can print for the casual use cases out of the box, and I could work out a terminal solution if I needed one in the meantime.
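    For reference, here's a minimal sketch of driving CUPS from a script instead of the GUI, using the pycups bindings. The queue name and the option value are hypothetical placeholders; list your actual queues with getPrinters():

    ```python
    # Minimal CUPS scripting sketch using the pycups bindings
    # (pip install pycups). Queue name and options are placeholders.
    import cups

    conn = cups.Connection()
    print(list(conn.getPrinters()))    # discover your queue's name

    # Hypothetical queue name; substitute whatever getPrinters() reports.
    job_id = conn.printFile(
        "Brother_HL_L2340D",           # destination queue
        "document.pdf",                # file to print
        "test job",                    # job title
        {"print-quality": "5"},        # standard IPP option: 5 = high
    )
    print("submitted job", job_id)
    ```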

    I don't print much, so I haven't put time into getting things working better for bigger jobs, but printing is definitely a more hit-or-miss experience on Arch. It's looking like a better GUI experience for my specific model will require a driver from the AUR or scripting the Debian install from Brother's driver site. But my model is apparently not as widely used and just hasn't gotten as much community support, I guess.



  • Personal experience bias in mind: I feel like owners and managers are less interested in resolving tech debt now vs. even 5 years ago… Business owners want to grow sales and the customer base; they don't want to hear about how the bad decisions made 3 years ago are making us slow, or how the short-term solution we compromised on last month means we can't just magically scale the product tomorrow. They also don't want to give us time to resolve those problems in order to move fast. It becomes a double-edged sword, and they try the “oh well, when we hit this milestone we can hire more people to solve the tech debt” line… but it doesn't really work that way.

    It's also possible I'm more sensitive to the problem now that I'm in lead/principal roles rather than senior roles. I put my foot down on tech debt a lot, but sometimes I can't. It's a vicious cycle, and it'll only get worse the longer the tech sector is stuck in this investor-fueled forever-growth mindset.

    Too much “move fast and break things” from non-technical people, not enough “let's build a solid foundation now to reap rewards later”. It's a prioritization of short-term profits. And that means we, the engineers, often get stuck holding the bag of problems to solve. And if you care about your work, it becomes a point of frustration even if you try to view the job as just a job.


  • Until recently, Wayland development was rather slow, especially in the areas where more specialized software runs into issues that force it to stick with X11. Since Wayland does a lot less than X11 and is more componentized across multiple libraries designed to be swappable, some of these areas simply do not have solutions. Yet.

    And, as always with FOSS, funding is a big part of the problem. The recent funding boosts the GNOME Foundation received have also led to some increased funding for work on Wayland and friends. In particular, accessibility has been almost nonexistent on Wayland, which means that if an app wants to ensure certain levels of accessibility, it can't switch to Wayland. GNOME's Newton effort is still very alpha, but promising.

    While big apps like Blender and Krita get good funding, they can't necessarily solve the problem themselves by throwing money at it, either. But the more funding Wayland gets to fill in the feature gaps and ease adoption, the sooner we'll be able to move away from XWayland as a fallback.

    Wayland and its whole implementation process certainly aren't without fault. There's a lot of really justified anger and frustration all around. Even so, staying on X11 isn't a solution.


  • While I found Ubuntu's business practices (all the upsells, mostly) the most grating, the thing that really pushed me off Ubuntu was packages being inexplicably behind, plus all the forking/modifying they did to GNOME while always staying like 1-2 major versions behind it, especially since GNOME has been shipping tons of features the last few years and Ubuntu wouldn't get them for ages.

    Outside of snaps, which Ubuntu seems to force you back into even if you purposely turn them off, it's not the worst thing to avoid. Or just deal with for a few apps.

    If they want the Ubuntu stack of tooling, suggest Debian. If they feel intimidated by Debian, Ubuntu is fine. Debian is really solid out of the box for a primary device nowadays; there's no need to wait for Ubuntu to bless packages, since the Debian repos are usually much faster to update. But as long as they aren't doing really weird stuff, they can always move off Ubuntu to Debian or any other Debian descendant easily if they want a smooth transition, since it's the same package manager.

    As long as the immutable-distro paradigm isn't a turn-off for them, Vanilla OS is also really neat, including cross-package-manager installs. V1 is Ubuntu-based, v2 will be Debian-based (if it isn't already GA'd… I know that's soonish).

    I've mostly switched to using Debian for dev containers and servers, and 99% of the time any Ubuntu-specific guides are still perfectly helpful. I moved to Arch for main devices.

    (Side note: I abandoned Manjaro for similar reasons as I abandoned Ubuntu: too much customization forced upon me, Manjaro's package repo always being behind or even shipping some broken packages vs. the Arch repos, and some odd decisions by the maintainers about all sorts of things. EndeavourOS has been just way better for someone who likes a less-dictated setup that stays closer to the distro base and gets package updates faster.)

    Edit: I guess my tl;dr is… if one thinks “Ubuntu”, first ask “why not Debian?”, and then proceed to Ubuntu if there are some solid reasons to do so for the situation.



  • Power users love to bash accessibility features like this. It's a classic case of “I don't need a wheelchair ramp, so I don't know why the library added one!”

    Accessibility is way more than screen readers. It’s more than specific disability-minded modes. The web needs to be friendly to everyone, including people who may not know they could benefit from accessibility features. Everyone benefits from this type of work.

    There are definitely some legit feature concerns and priorities being called out here; Mozilla has left a lot to be desired on that front of late. But a power user is more than capable of jumping into settings or about:config to turn things like this off, or finding an extension to get by for now.

    Also, the Firefox dev team isn't tiny. This isn't blocking other work in any substantial way; it's a fairly isolated piece of UI, and there's no guarantee that skipping this would change the timeline on anything else.