I was thinking that I would have to switch to BSD.
Finally the year of Hurd on the desktop?
I’m a technical kinda guy, doing technical kinda stuff.
You’re flashing the chip directly, so apart from inadvertent short circuits and such, if it doesn’t work you can just keep trying until it does.
As for wire length, it all depends on how fast they clock the SPI bus when flashing. You’ll probably be able to get away with 20cm or so without difficulty; I’ve driven SPI displays over that kind of wire length before.
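As a rough sketch, with flashrom and something like a CH341A programmer (the programmer, file names and spispeed value here are just examples; yours will differ):

    # Read out the existing contents first, so you have a backup
    flashrom -p ch341a_spi -r backup.bin
    # Write the new image; flashrom verifies the write by default
    flashrom -p ch341a_spi -w firmware.bin
    # On a Raspberry Pi you can slow the SPI clock (in kHz) to tolerate longer wires
    flashrom -p linux_spi:dev=/dev/spidev0.0,spispeed=1000 -w firmware.bin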
Something like a raspberry pi or equivalent, and use reverse SSH set up to connect to a server with a known address on your end.
This means that ports don’t need to be opened on their end.
Also if you go with a gateway host, shift SSH to a randomised port like 37465, and install fail2ban.
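A minimal sketch of the reverse tunnel, assuming the remote box can SSH out to your server (host names, users and ports here are made up):

    # On the remote raspberry pi: expose its SSH port as port 2222 on your server
    ssh -N -R 2222:localhost:22 tunnel@your.server.example
    # On your server: hop back through the tunnel to reach the pi
    ssh -p 2222 pi@localhost

In practice you’d wrap that first command in autossh or a systemd unit so the tunnel comes back up after dropouts.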
Microsoft is shit. Windows is shit. Windows 11 is a goddamn privacy nightmare.
But at the end of the day, it just fucking works; those damn bastards ensure that. And even when something doesn’t work, for some unknown reason most of the online solutions do seem to fix the issue.
Hahahahahahahahahahaha
(Pause for breath)
Hahahahahahahahahahaha
Only if you count “most of the online solutions” as “run SFC /SCANNOW and if that doesn’t work, just reinstall your OS”.
As another poster has mentioned, M-Discs are written using a Blu-ray writer and are good for a few hundred years, in theory.
A Blu-ray USB drive and M-Discs are about the best you can get at present. Keep the drive unplugged when not in use; it’ll probably last 10-20 years in storage.
Seeing as there hasn’t been much advancement past Blu-ray, keep an eye out for something useful to replace it in the future, or at least get another drive when you notice them becoming scarce.
I don’t think there’s anything commercially available that can do it.
However, as an experiment, you could probably script this kind of operation, given software that can automatically identify and group images.
They need to learn how to use their tools better. WinSCP does all that transparently for you if you press F4 on a file on a remote system. Or maybe they did and you just didn’t see it…
It’s quite a handy function when you’re diving through endless layers of directories on a remote box looking for one config file amongst many.
If library devs do versioning correctly, and you pin to major versions like “1.*” instead of just the “anything goes” of “*”, this should not happen.
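In pip terms, for example (package names invented for illustration):

    # requirements.txt: pinned to the 1.x major line
    somelib>=1.4,<2.0    # any 1.x from 1.4 up, never a breaking 2.0
    otherlib==2.3.1      # exact pin if you want full reproducibility
    # versus the "anything goes" that eventually bites you:
    # riskylib           (no constraint at all)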
Your unit tests should catch regressions, if you have enough unit tests. And of course you do, because we’re all operating in the dream world of “I am great and everyone else is shit”.
If you’re interested in the systems behind Apollo, go find and read “Digital Apollo”.
It goes all the way through the project and describes everything in good detail: how they developed the control systems and the computer hardware, how the software was designed, how they implemented one of the first real pieces of computer-systems project management, all the interactions with astronauts/test pilots who still wanted to “manually fly the lander”, the political back-and-forth between competing teams, the whole thing.
It’s a great read if you have a technical mindset.
how the IT team tries to justify being locked into Microsoft, and then telling me I could potentially become a point of vulnerability
Because they can manage and control all the Windows PCs: pushing updates automatically, restricting what users can do locally and on the network, running their monitoring tools and whatever antivirus and antimalware they have, and easily managing and deploying/removing software and the associated group licensing, and so on and so forth.
Meanwhile you’re a single user of unknown (to them) capabilities that they now have to trust with the rest of their system, basically.
The first rule of corporate IT is, “control what’s on your network”. Your PC is their concern still, but they have no effective control over it. That’s why they’re being a bit of a pain in the ass about it.
“Hey Pizza Shop, it’s The Law here. Did you have any orders for an ‘A. Tate’ recently? You did? Where did you deliver them to? Ok, thanks.”
Stupider things have happened, and if I were a detective you can be damn sure I’d at least give this a try.
True. Hence my caveat of “most cards”. If it’s got LEDs on the port, it’s quite likely to signal which speed it is at with those LEDs.
I haven’t yet come across a gigabit card that won’t do 10Mbit (edit: switches are a different matter), but I have sometimes come across cards that fail to negotiate speeds correctly, e.g. trying for gigabit when they only actually have a 4-wire connection that can support 100Mbit. Forcing the card to the “correct” speed makes them work.
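On Linux you can check and force this with ethtool (the interface name is just an example):

    # See what the card actually negotiated
    ethtool eth0
    # Force 100Mbit full duplex when gigabit negotiation fails on a 4-wire run
    sudo ethtool -s eth0 speed 100 duplex full autoneg off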
in which case I will go one level down, to the calculateExtraCommissions() method.
In which case you will discover that the calculateExtraCommissions() function also has the same nested functions and you eventually find six subfunctions that each calculate some fraction of the extra commission, all of which could have been condensed into three lines of code in the parent function.
Following the author’s idea of clean code to the letter results in a thick and incomprehensible function soup.
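A hypothetical sketch of the pattern, with all the names and numbers invented to illustrate the point:

    # The "clean code" version: one function per term, scattered across the file.
    def calculateExtraCommissions(sale):
        return (base_bonus(sale) + volume_bonus(sale) + loyalty_bonus(sale)
                + seasonal_bonus(sale) + referral_bonus(sale) + holiday_bonus(sale))

    def base_bonus(sale):     return sale["amount"] * 0.01
    def volume_bonus(sale):   return sale["amount"] * 0.02 if sale["units"] > 10 else 0.0
    def loyalty_bonus(sale):  return 5.0 if sale["repeat_customer"] else 0.0
    def seasonal_bonus(sale): return sale["amount"] * 0.005
    def referral_bonus(sale): return 10.0 if sale["referred"] else 0.0
    def holiday_bonus(sale):  return sale["amount"] * 0.01 if sale["holiday"] else 0.0

    # Versus roughly three lines doing the same job in the parent function:
    def calculate_commission(sale):
        extra = sale["amount"] * (0.01 + 0.005 + (0.01 if sale["holiday"] else 0.0))
        extra += sale["amount"] * 0.02 if sale["units"] > 10 else 0.0
        extra += (5.0 if sale["repeat_customer"] else 0.0) + (10.0 if sale["referred"] else 0.0)
        return extra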
Energy efficiency can be offset by extra computational ability though.
E.g. Linux has a plethora of CPU and IO schedulers and allows you to tune the system to maximise performance for your particular workload. Getting more performance than the generic CPU and IO schedulers in other OSes provide generally means more power consumption, unless you do some sort of “performance per watt” calculation to take that into account.
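For instance, on a stock Linux box you can swap the IO scheduler per block device at runtime and pin the CPU governor (the device name is an example):

    # The bracketed entry is the scheduler currently in use
    cat /sys/block/sda/queue/scheduler
    # Switch that device to a latency-oriented scheduler
    echo mq-deadline | sudo tee /sys/block/sda/queue/scheduler
    # Pin the CPU governor to maximum clocks, trading watts for speed
    sudo cpupower frequency-set -g performance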
For later reference, the link light on most network cards is a different colour depending on link speed. Usually orange for 1G, green for 100M and off for 10M (with data light still blinking).
I haven’t cared about, or terminated, A-spec since network cards gained auto MDI/MDIX about 20 years ago.
But it’s three more letters. No deal.
Yeah, it’s a little strange in OP’s case; I can’t really recall changing a CMOS battery in ages, like decades of computer use.
Interfaces can be needlessly complex regardless of being flat or skeuomorphic.
But flat interfaces still require mental effort to parse. Especially when the interface is complex and/or crowded and you’re trying to pick out active UI elements amongst decorations like group boxes/panels.
Essentially, flat interfaces are currently popular because of touchscreen devices. Touchscreen devices have limited space and thus need simplistic UI elements that can be prodded by a fat finger on a small screen.
But I don’t need a flat touchscreen-friendly interface on my non-touch dual 24" monitors with acres of screen real estate. I need an interface that nicely separates usable UI elements from the rest of the application window. That means 3D hints on a 2D screen, which gives my monkey-brain, with its five million years of evolved 3D vision, the opportunity to run the “click the button” mental command as a background process.