Telorand

joined 1 year ago
[–] [email protected] 1 points 41 minutes ago

Though let's be frank: movie theaters want you to smell popcorn so you'll buy snacks. Smell-o-Vision would have to be more lucrative than a $15 bucket of popcorn.

[–] [email protected] 1 points 43 minutes ago

I appreciate the effort you put into the comment and your kind tone, but I'm not really interested in increasing LLM presence in my life.

I said what I said, and I experienced what I experienced. Providing me an example where it works in no way falsifies the core of my original comment: LLMs have no place generating code for secure applications outside of human review, because they have no mechanism to comprehend or proofread their own work.

[–] [email protected] 1 points 48 minutes ago

I didn't say that. However, if delegation is too risky, do the work yourself.

[–] [email protected] 5 points 2 hours ago (1 children)

Neuralink test subject: Why do I smell burnt toast?

[–] [email protected] 4 points 2 hours ago (1 children)

Who would I jail? The C-officers. Your shit show, your responsibility. If you can't trust your employees, figure out why or do the work yourself.

[–] [email protected] 12 points 3 hours ago (5 children)

This will never happen. Smell-o-Vision and its successors have been in development for decades, and they all have the same issue: where to store the numerous scent liquids. You can't just digitize scent and generate it on demand with some kind of solid state device. You can't just combine three liquids to make 1000 scents—the article's analogy of combining light to make colors is overly optimistic, bordering on delusional.

The other two related problems are convenience and cost. This is 1000% a novelty, and novelties quickly lose their appeal after you experience them the first time. Who is seriously going to go out and buy replacement cartridges for a thing that is essentially a toy?

[–] [email protected] 15 points 3 hours ago (5 children)

Seems like a paltry amount, given what savvy social engineers could do with that data.

If you don't use proper security practices, you should be on the hook for prison time at a minimum.

[–] [email protected] 1 points 4 hours ago (2 children)

It was ChatGPT from earlier this year. It wasn't a huge deal for me that it made mistakes, because I had a very specific use case and just wanted to save some time; I knew I'd have to troubleshoot grafting it into my function. But even after I pointed out that it was using deprecated syntax (and how to correct it), it just spat out the code again with even more errors and still using deprecated syntax.

All LLMs will fail like this in some way, because they don't actually understand what they're generating (i.e. they have no mechanism for self-evaluating the veracity of their statements).

[–] [email protected] 8 points 21 hours ago

There's certainly room to grow with regard to workers' rights. I think you could probably solve at least a few of those problems if the workers were covered by a union, since publishers who hire them would have to bargain for good development contract terms.

[–] [email protected] 6 points 21 hours ago

That's true. The mistakes actually make learning possible!

Man, designing a CS curriculum will be easy in the future. Just ask an LLM to do something simple, and have your CS students correct the code.

[–] [email protected] 9 points 22 hours ago

> cash treadmill

Borrowing this turn of phrase

[–] [email protected] 7 points 22 hours ago (1 children)

Bruh, what do you mean "future?" That's me right now!

This isn't a joke, though it almost seems like one. It uses Llama 3.1, and supposedly the conversation data stays on the device and gets forgotten over time (through what the founder calls a rolling "context window").
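
For anyone curious, here's a minimal sketch of what a rolling context window could look like: keep only the most recent turns within a fixed budget and quietly drop the oldest ones. The `RollingContext` class, the character budget, and the turn format are my own guesses for illustration, not the device's actual implementation.

```python
from collections import deque

class RollingContext:
    """Hypothetical rolling context window: retains only the most recent
    conversation turns within a rough character budget."""

    def __init__(self, max_chars: int = 4000):
        self.max_chars = max_chars
        self.turns = deque()

    def add(self, turn: str) -> None:
        self.turns.append(turn)
        # "Forget" the oldest turns once the budget is exceeded.
        while sum(len(t) for t in self.turns) > self.max_chars and len(self.turns) > 1:
            self.turns.popleft()

    def prompt(self) -> str:
        # Only whatever is still in the window gets fed to the model.
        return "\n".join(self.turns)

ctx = RollingContext(max_chars=200)
for i in range(20):
    ctx.add(f"Turn {i}: something the wearer said")
print(ctx.prompt())  # only the most recent turns survive
```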

The implementation is interesting, and you can see the founder talking about earlier prototypes and project goals in interviews from several months ago.

iOS only, for now.

Edit: Apparently, you can build your own version for around $50 that runs on ChatGPT instead of Llama. I'm sure you could also figure out how to swap in the LLM of your choice.
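
If the DIY build talks to an OpenAI-compatible endpoint, swapping the backend could be as simple as repointing the client at a local server running Llama. This is just a rough sketch: the localhost URL, model name, and placeholder API key are assumptions about a typical local setup (e.g. Ollama), not the actual project's config.

```python
from openai import OpenAI

# Point the standard OpenAI client at a local OpenAI-compatible server
# (e.g. Ollama serving Llama 3.1) instead of ChatGPT. All values below
# are placeholders, not the DIY project's real configuration.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # local server; use the default base_url for ChatGPT
    api_key="not-needed-locally",          # most local servers ignore the key
)

reply = client.chat.completions.create(
    model="llama3.1",  # whatever model your local server exposes
    messages=[{"role": "user", "content": "Summarize what I said today."}],
)
print(reply.choices[0].message.content)
```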
