jacksilver

joined 1 year ago
[–] [email protected] 1 points 1 week ago (1 children)

From a DevOps perspective, with Jupyter notebooks you could just build a process to export the notebooks to standard .py files and then run them.

There are actually a lot of git hooks that will export/convert .ipynb to .py files automatically, since notebooks don't work great with git.
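The export itself is simple because a notebook is just JSON. Here's a minimal stdlib-only sketch of the idea; real hooks usually shell out to `jupyter nbconvert --to script` or use jupytext instead, and the function name here is made up for the example:

```python
import json
from pathlib import Path

def notebook_to_script(ipynb_path: str) -> str:
    """Concatenate a notebook's code cells into one .py source string.

    Minimal sketch: markdown cells are dropped; a real converter would
    keep them as comments and handle cell magics.
    """
    nb = json.loads(Path(ipynb_path).read_text(encoding="utf-8"))
    chunks = []
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            # A cell's "source" is usually a list of lines, sometimes one string.
            src = cell["source"]
            chunks.append("".join(src) if isinstance(src, list) else src)
    return "\n\n".join(chunks)
```

Wire that into a pre-commit hook that writes the .py next to the .ipynb and you get diffable history for free.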

[–] [email protected] 12 points 1 week ago

Yep, don't be afraid to ask for help. The people at polling places are usually very nice and helpful!

[–] [email protected] 3 points 1 month ago (1 children)

The download feature is always in some state of broken, but it has gotten a lot better over the past couple of years. If you haven't tried it in a year or so, you may have better luck now.

[–] [email protected] 2 points 1 month ago

Fair point then about the argument around safety. For me the bigger issue is control. Cars with kill switches and conditions on their use are a slippery slope. Just look at what's happened with software and media. I don't want to have to pirate my car or load custom firmware so I can use it as I want.

[–] [email protected] 6 points 1 month ago (2 children)

I don't think there is a car where the seat belt is tied to anything besides a little notification beep. Seems like a different situation if the "safety" feature dictates how the car is used.

[–] [email protected] 2 points 1 month ago (1 children)

Do you still use WASM? I've been exploring the space and wasn't sure what the best tools are for developing in that space.

[–] [email protected] 12 points 1 month ago

Also "Win + →" or "Win + ←" to move a tile to the left or right side.

[–] [email protected] 9 points 1 month ago

This is just the estimate to train the model, so it's not accounting for the cost of developing the training system, collecting the data, etc. This is pure processing cost, and the numbers are staggeringly large.
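You can sanity-check that processing cost with the common rule of thumb of ~6 FLOPs per parameter per training token. The parameter count, token count, GPU throughput, and price below are illustrative assumptions for the sketch, not figures from the article:

```python
def training_cost_usd(params: float, tokens: float,
                      flops_per_gpu_sec: float, usd_per_gpu_hour: float) -> float:
    """Back-of-envelope training compute cost via the ~6*N*D FLOPs rule."""
    total_flops = 6 * params * tokens
    gpu_hours = total_flops / flops_per_gpu_sec / 3600
    return gpu_hours * usd_per_gpu_hour

# Assumed: 70B-parameter model, 1.4T tokens, ~3e14 effective FLOP/s
# per GPU, $2 per GPU-hour.
cost = training_cost_usd(70e9, 1.4e12, 3e14, 2.0)
print(f"${cost:,.0f}")  # on the order of a million dollars
```

And that's compute alone; scale the parameters and tokens up to frontier-model sizes and the bill grows accordingly.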

[–] [email protected] 11 points 1 month ago (4 children)

LLMs do suck at math. If you look into it, the o1 models actually escape the LLM output and write a Python function to calculate the answer. I've been able to break their math by asking for calculations that need functions outside the standard Python library.

I know someone also wrote a Wolfram integration to help LLMs solve math problems.
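The "escape to code" pattern can be sketched as a calculator tool: the model emits an arithmetic expression and a harness evaluates it instead of trusting the LLM's mental math. This is an illustrative sketch of the pattern, not OpenAI's actual implementation; `calc` and its whitelist are made up for the example, and the `ValueError` at the end mirrors how a sandboxed tool breaks on math outside its library:

```python
import ast
import operator

# Whitelisted arithmetic operators; anything else (names, calls, etc.)
# is rejected.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def calc(expression: str) -> float:
    """Safely evaluate a pure-arithmetic expression from model output."""
    def ev(node):
        if isinstance(node, ast.Expression):
            return ev(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](ev(node.left), ev(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](ev(node.operand))
        raise ValueError("expression uses math the tool doesn't support")
    return ev(ast.parse(expression, mode="eval"))
```

So `calc("3**4 - 17/2")` works, but ask for something like `gamma(5)` and the tool throws, which is exactly the failure mode described above.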

[–] [email protected] 4 points 1 month ago (1 children)

Not sure if you're serious, but they were making a joke: Intel, which makes chips, is a competitor to TSMC, the chip manufacturer from the article.

So they played on that relationship by treating the word "intel" in your "thanks for the intel" comment as meaning the company.

[–] [email protected] 1 points 1 month ago

Just read up more about these systems. I always thought they charged you more, and didn't realize that for the time being they're zero-interest loans.

Seems unsustainable, but it sounds like they're using the credit card technique of charging the storefront. It'll be interesting to see where the BNPL industry goes.
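The "charge the storefront" model in miniature: the shopper pays equal installments with no interest, and the platform's revenue is a merchant fee skimmed off the sale. The 6% fee rate and the function name below are illustrative assumptions, not real BNPL terms:

```python
def bnpl_split(price: float, merchant_fee_rate: float = 0.06,
               installments: int = 4):
    """Return (per-installment payment, merchant payout, platform revenue)."""
    payment = round(price / installments, 2)
    fee = round(price * merchant_fee_rate, 2)
    return payment, round(price - fee, 2), fee

payment, payout, fee = bnpl_split(200.00)
# Shopper pays 4 x $50 with zero interest; the merchant nets $188
# and the platform keeps $12, just like card interchange.
```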

[–] [email protected] 1 points 1 month ago (2 children)

Why be the bad guy when you can just enable them?

view more: next ›