towerful

joined 1 year ago
[–] [email protected] 0 points 1 month ago (5 children)

If only it were the government that invested in the R&D and tech to make it happen.
Raising funds from taxes (meaningful taxes), and investing that money in making their country better.

Hopefully this decision is because of carbon taxes that will make consumer products representative of the actual cost of the item (not the exploitative cost).

No no, let the free market decide.
Fucking AI threatening to replace basic jobs (when it's more suited to replace the C-suite), gobbling up energy and money; too-big-to-fail bailouts and loophole tax rules bullshit.

So yeah, someone needs to spend the money, and that should be the government.
Because they should realise that carbon fuel sources are a death sentence.

[–] [email protected] 13 points 1 month ago* (last edited 1 month ago) (2 children)

I agree, and it is possibly the only good thing to come out of AI.
Like people asking "why do we need to go to the moon?!".

Fly-by-wire (i.e. pilot controls decoupled from physical actuators), so modern air travel.

Integrated circuits (i.e. multiple transistors - and other components - in the same silicon package). Basically miniaturisation and reduction in power consumption of computers.

GPS. The Apollo missions led to the rocket tech/science for the high orbits required for GPS.


This time it is commercial.
I'd rather the power requirements were covered by non-carbon sources. However, it proves the tech for future use.

For a similar example, I have a strong dislike of Elon Musk. He has ruined the potential of Twitter and Tesla, but SpaceX has had some impressive accomplishments.

Google are a shitty company. I wish the nuclear power went towards shutting down carbon power.
But SOMEONE has to take the risk. I wish that someone was a government. But it's Google. So.... Kind of a win?

[–] [email protected] 3 points 1 month ago* (last edited 1 month ago) (1 children)

HDD, SSD and NVMe all have different generations. Within a tier, a later generation is normally about 2x faster than the previous one; jumping up a tier at a comparable generation is normally about an 8x speedup. (Later generations are in parentheses below.)

HDD to SSD is like 80 (160) -> 300 (600) MB/s.
SSD to NVMe is 300 (600) -> 2400 (4800, 14000) MB/s.

So it's likely a similar upgrade, unless you did HDD g1 to SSD g2 to NVMe g1 (using g1/g2 to simplify).
It's also possible that your computer is running so fast that a doubling or quadrupling in speed is a diminishing return, as you don't notice the difference.
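A quick sanity check on those (rough, assumed) sequential-transfer figures, showing why crossing a tier while skipping a generation flattens the jump:

```python
# Rough sequential MB/s figures from above (g1, g2 per tier) - illustrative only.
tiers = {"HDD": (80, 160), "SATA SSD": (300, 600), "NVMe": (2400, 4800)}

names = list(tiers)
for a, b in zip(names, names[1:]):
    print(f"{a} g2 -> {b} g1: {tiers[b][0] / tiers[a][1]:.1f}x")
    print(f"{a} g1 -> {b} g2: {tiers[b][1] / tiers[a][0]:.1f}x")
# HDD g2 -> SATA SSD g1: 1.9x   (crossing a tier but skipping a generation)
# HDD g1 -> SATA SSD g2: 7.5x   (crossing a tier and gaining a generation)
```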

[–] [email protected] 1 points 1 month ago

You kinda made my point with the whole "try and find another operator to send 2400bps to" part. Digital communication is not conventional, it's revolutionary.
Analog communication is conventional. And radios and their components aren't exotic.

Yes, modern communication is fantastic. But analog will still be more reliable.

[–] [email protected] 8 points 1 month ago

Eventually you will get used to it.
You have 3 options.

  1. normalise to OSX shortcuts (and reconcile your Linux shortcuts to those). You are more likely to encounter an OSX machine "in the wild", and if you have to get a new Mac then everything is instantly comfortable. Linux is also easier to customise.

  2. normalise to your Linux shortcuts. Figure out how to script OSX to adopt those shortcuts so you can quickly set up a new work machine (there's a sketch of this below), and accept that you won't always be able to use those shortcuts (like when using a loaner or helping someone).

  3. accept the few years of confusing OSX vs Linux shortcuts, and learn both.

Option 3 is the most versatile. Takes ages, and you will still make mistakes.
Option 2 is the least versatile, but is the fastest to adopt.
Option 1 is fairly versatile, but probably has the longest adoption/pain period.

If OSX is in your future, then it's option 1.
Option 3 is probably the best.
If you are never going to interact with any computer/server other than your own & other Linux machines, then option 2. Just make sure that every preference/shortcut you change is scriptable or at least documented, and that the process is stored somewhere safe.
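For option 2, a minimal sketch of scripting OSX shortcuts. NSUserKeyEquivalents is a real macOS defaults key for remapping menu-item shortcuts, but exact menu titles, and whether a given app honours the remap, vary - treat the specific mapping below as illustrative:

```python
import subprocess

# Modifier syntax: "@" = Cmd, "~" = Option, "^" = Ctrl, "$" = Shift.
# The menu title must match the app's menu item text exactly.
def remap_menu_shortcut(menu_title: str, keys: str) -> None:
    """Remap a menu shortcut globally via NSUserKeyEquivalents."""
    subprocess.run(
        ["defaults", "write", "-g", "NSUserKeyEquivalents",
         "-dict-add", menu_title, keys],
        check=True,
    )

# Illustrative example: bind Cmd+Shift+W to "Close Window" everywhere.
remap_menu_shortcut("Close Window", "@$w")
```

Keeping calls like these in one version-controlled script is what makes a new work machine quick to adopt.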

[–] [email protected] 4 points 1 month ago

It's all from Latin minutus vs minuta.
https://www.etymonline.com/word/minute

My-noot for small.
Min-ut for time.

[–] [email protected] 36 points 1 month ago* (last edited 1 month ago) (13 children)

I don't think smart phones are conventional communications. They are smart. They are still the "tech of tomorrow".
Smart phones use conventional communications to do very clever things. But those clever things are range limited and require specialised equipment. They also have absolutely no "hackability" without specialised equipment (easy to get, sure... but still pretty much single purpose).

AM is literally a couple of caps, inductors and resistors (edit: and a diode), then an amplifier (a couple of transistors and resistors). And the range of lower frequency radio waves is (or can be) phenomenal.
It's just that it takes some experience to operate on these frequencies, and their bandwidth is limited.
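That "diode then amplifier" receiver is just envelope detection. A toy numpy sketch (frequencies are made up to keep the sample rate small; real AM carriers sit around 530-1700 kHz):

```python
import numpy as np

fs = 100_000                     # sample rate, Hz
t = np.arange(0, 0.02, 1 / fs)
carrier_hz = 10_000
audio_hz = 440

audio = np.sin(2 * np.pi * audio_hz * t)
am = (1 + 0.5 * audio) * np.sin(2 * np.pi * carrier_hz * t)

rectified = np.maximum(am, 0)            # the diode: half-wave rectification
kernel = np.ones(40) / 40                # crude RC low-pass (moving average)
envelope = np.convolve(rectified, kernel, mode="same")

# 'envelope' now tracks a scaled copy of 'audio' - all that's left
# for the amplifier stage is to make it loud enough to hear.
```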

Smart phones do away with the experience requirements, and trade range for higher frequencies & higher data rates (and I guess trade simplicity for digital encoding).

I see parallels to software.
People are nervous about "side loading" apps on their phone, but have no issues downloading and installing an exe on Windows.
Smart phones give you the "this is how" kind of experience, and abstract away the sheer amount of technology they leverage. Which is amazing, and is what makes them smart!
But the underlying technology is phenomenal. And I feel it's a shame that the majority of people don't have any understanding of "installing an app" or similar (like calling internet access "WiFi"... two distinct things!)

[–] [email protected] 1 points 1 month ago

It's pretty serendipitous, actually.
The past month I've done a somewhat deep dive into LoRa for a project.
I ultimately dismissed it due to the data rates, but for simple remote controls or for sensors - things that report a couple bytes - it seems awesome.
I'm sure you can squeeze higher data rates out of it, but when I evaluated it I decided to go with a hardwired network link (I had to have stability; dropped info wasn't an option. But the client had a strong preference for wireless).
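For a sense of why the data rates ruled it out: the commonly quoted LoRa bit-rate formula (SF = spreading factor, BW = bandwidth, coding rates 4/5..4/8) works out like this. These are nominal PHY rates, before any protocol overhead:

```python
def lora_bitrate(sf: int, bw_hz: float, cr_denominator: int = 5) -> float:
    """Nominal LoRa bit rate: Rb = SF * (BW / 2**SF) * (4 / cr_denominator)."""
    return sf * (bw_hz / 2**sf) * (4 / cr_denominator)

for sf in (7, 9, 12):
    print(f"SF{sf} @ 125 kHz: {lora_bitrate(sf, 125e3):.0f} bit/s")
# SF7  @ 125 kHz: 5469 bit/s
# SF9  @ 125 kHz: 1758 bit/s
# SF12 @ 125 kHz:  293 bit/s - fine for a few sensor bytes, hopeless for bulk data
```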

[–] [email protected] 2 points 1 month ago (2 children)

WiFi uses BPSK/QPSK/OFDM/OFDMA modulation.
LoRa uses CSS modulation.

This is about hacking WiFi hardware to make a WiFi-modulated signal intelligible to a receiver expecting CSS modulation, and to have the WiFi hardware demodulate a CSS signal.
Thus making WiFi chips work with LoRa chips.

LoRa doesn't care about the carrier frequency.
So the fact that it's LoRa at 2.4GHz doesn't matter. It's still LoRa.
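For a feel of what CSS modulation is: a LoRa-style symbol is a base up-chirp cyclically shifted by the data value, and the receiver "dechirps" (multiplies by the conjugate chirp) and takes an FFT - the peak bin is the symbol. A minimal sketch with illustrative parameters, not any real PHY config:

```python
import numpy as np

SF = 8
N = 2**SF                                      # samples per symbol
k = np.arange(N)
base_chirp = np.exp(1j * np.pi * k * k / N)    # discrete up-chirp

def modulate(symbol: int) -> np.ndarray:
    return np.roll(base_chirp, -symbol)        # cyclic shift encodes the data

def demodulate(rx: np.ndarray) -> int:
    dechirped = rx * np.conj(base_chirp)       # flattens the chirp to a pure tone
    return int(np.argmax(np.abs(np.fft.fft(dechirped))))

print(demodulate(modulate(42)))                # -> 42
```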

I'm sure there will be a use for this at some point.
Certainly useful for directly interfacing with LoRa devices from a laptop.
I feel that anyone actually deploying LoRa IoT would be working at a lower level than a "throw a laptop at it" kinda thing.

[–] [email protected] 4 points 1 month ago (4 children)

It's LoRa on 2.4GHz.
It's just that chirp signals are easy to decode from a lot of noise.
And they don't really affect most other modulation techniques. I think you can even have multiple CSS coded signals on the same frequency, as long as they are configured slightly differently.

LoRa is incredibly resilient.
It's just really, really slow.
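A toy illustration of that resilience (assumed, simplified parameters): bury a chirp-coded symbol 5 dB below the noise floor, and the dechirp-plus-FFT trick still finds it, because the FFT coherently adds all 2^SF chips into one bin while the noise spreads across every bin:

```python
import numpy as np

rng = np.random.default_rng(0)
SF = 8
N = 2**SF
k = np.arange(N)
base = np.exp(1j * np.pi * k * k / N)          # base up-chirp

tx = np.roll(base, -42)                        # symbol 42 as a shifted chirp
snr_db = -5                                    # signal below the noise floor
sigma = np.sqrt(10 ** (-snr_db / 10) / 2)
noise = sigma * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

spectrum = np.abs(np.fft.fft((tx + noise) * np.conj(base)))
print(int(np.argmax(spectrum)))                # -> 42, despite the noise
```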

[–] [email protected] 7 points 1 month ago

The issue is with how aggressive Microsoft is about it.

Trying to download chrome? "Hey, are you sure you don't want to try Edge?".
Changing default browser? "Hey, are you sure you don't want to try Edge?".
Windows update... "We've done you a solid, because we know you want to use Edge".
I'm sure at one point, it was a warning in the security center that you aren't using Edge.
Also Teams (I'm sure there are others) will open links in Edge, regardless of what default browser you have set.

[–] [email protected] 8 points 1 month ago (1 children)

Oh, I thought it was hair dye that was dripping down his face.
Makes sense it was ichor leaking from somewhere.
