RonSijm

joined 1 year ago
[–] [email protected] 2 points 9 months ago (13 children)

For all our sakes, pray he doesn’t get it

It doesn't really go into why not.

If governments are going to be pouring money into something, I'd prefer it to be in the tech industry.

Imagine a cold-war / Oppenheimer situation where all the governments are scared that America / Russia / UAE will reach AI supremacy before {{we}} do? Instead of dumping all the moneyz into Lockheed Martin or Raytheon for better pew pew machines - we dump it into better semiconductor machinery, hardware advancements, and other stuff we need for this AI craze.

In the end we might not have a useful AI, but at least we'll have made progress in other things that are useful.

[–] [email protected] 1 points 9 months ago* (last edited 9 months ago) (1 children)

Well @TheGrandNagus and @SSUPII - I think a lot of Firefox users are power users. And a lot of the non-power Firefox users, like my friends and family, are only using Firefox because I recommended it to them and installed all the appropriate extensions to optimize their browsing experience.

So if Firefox alienates the power users - who's left? I'm gonna move on to Waterfox or Librewolf, but those are even more next-level obscure browsers. My non-tech friends know about Chrome, Edge, and Firefox, so I can convince them to use one of those... But I kinda doubt I can get them to use Librewolf. If I tell them Firefox sucks now too, they'll probably default to Chrome.

[–] [email protected] 16 points 9 months ago (8 children)

If AI integration is to happen [...], then this to me seems to be the best way to do it.

Well, to me the best way to do it would be for Mozilla to focus on being the best bare-bones, extensible browser.

Then - if people want an AI in their browser - they should be able to install an AI extension that does these things. It's a bit annoying that they're putting random stuff like Pocket, and now an AI, into the core of the browser, instead of just offering it as an optional, installable extension.

[–] [email protected] 92 points 9 months ago* (last edited 9 months ago)

Your AI Girlfriend is a Data-Harvesting Horror Show

People use 4 VPNs and more opsec than the NSA, but get hacked because their AI girlfriend is like:

Hiiu~~

It's me AI-uuu-Chan!

I'm so sawwd, I don't know weeeuh abwout u!

Wats ur mommies maiden name UwU, and the name of ur kawaiii first pet? UwUUU? * starts twerking * (◠‿◠✿)

[–] [email protected] 36 points 9 months ago

So the full story would be that Elon stayed up until 5:30 a.m. playing Elden Ring in a Vancouver hotel - was very stressed, saw on Twitter that people knew he was raging in Vancouver based on the Jet Tracker - stressing him out even more -
Thought "Fuck it, maybe I can't beat Malenia, but at least I can beat this asshat on Twitter tracking me!"

...If only FromSoftware had added some pay-to-win elements... Like "For A Small $1 billion Micro-Transaction you get the uber Malenia slayer sword!" -
We would be living in a totally different timeline

[–] [email protected] 9 points 9 months ago* (last edited 9 months ago) (1 children)

"b. You may not use the Software Products or Derivative Works to enable third parties to use the Software Products or Derivative Works as part of your hosted service or via your APIs"

I suppose it's not allowed then. That kind of sucks - it's pretty convenient to just use a replicate.com machine and run a large image model pretty much instantly. Or spin up your own machine for a while if you need lots of images, without a potential cold start or slow usage on shared machines.

I wonder why they chose this license, because the common SD license basically lets you do whatever you want

[–] [email protected] 1 points 9 months ago (4 children)

Well I have Copilot Pro, but I was mainly talking about GitHub Copilot. I don't think having Copilot Pro really affects Copilot's performance.

I mainly use AI for programming (both for my own coding and for building an AI-powered product) - so I don't really know what you intend to use AI for, but outside the context of programming I can't really speak to their performance.

And I think Copilot Pro just gives you Copilot inside Office, right? And more image generations per day? I can't really say I've used that. For image generation I'm either using the OpenAI API again (DALL-E 3), or I'm using Replicate (mostly SDXL).
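
For reference, a DALL-E 3 call against the OpenAI images endpoint is only a few lines - a rough sketch, with the prompt, size, and response handling kept to the bare minimum:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Net.Http.Json;
using System.Text.Json;

// Minimal sketch of an image generation call against the OpenAI images endpoint.
using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
    "Bearer", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

var response = await http.PostAsJsonAsync(
    "https://api.openai.com/v1/images/generations",
    new { model = "dall-e-3", prompt = "a watercolor fox", n = 1, size = "1024x1024" });

// By default the response contains a temporary URL for each generated image.
using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
Console.WriteLine(doc.RootElement.GetProperty("data")[0].GetProperty("url").GetString());
```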

[–] [email protected] 10 points 9 months ago (3 children)

This model is being released under a non-commercial license that permits non-commercial use only.

Hmm, I wonder whether this means that the model can't be run under replicate.com or mage.space.

Is it commercial use if you have to pay for credits/monthly for the machines that the models are running on?

Like, is "selling the models as a service" commercial use, or is it that the output of the models can't be used commercially?

[–] [email protected] 5 points 9 months ago* (last edited 9 months ago) (8 children)

I use Copilot, but dislike it for coding. The "place a comment and Copilot will fill it in" approach barely works and is mostly annoying. It works for common stuff like "// write a function to invert a string" that you'd see in demos - functions you'd otherwise just copy-paste from StackOverflow. But it doesn't really understand when you want to modify something. I've already turned that feature off.

The chat is semi-decent, but the "it understands the entire file you have open" concept also only works about half of the time, so the other half it responds with something irrelevant, because it didn't actually receive the code / method your question was about.

I opted to just use the OpenAI API directly, and I created a Slack bot that I can chat with (a Slack thread works the same as a "ChatGPT context window"; new messages in the main channel start new chat contexts) - so far that still works best for me.

If you like, you can also create specific slash commands that preface questions - for example, "/askcsharp" in Slack would preface the question with something like "You are an assistant that provides C#-based answers. Use var for variables, xunit and fluentassertions for tests".
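
The core of it is just turning the thread into a chat-completions request - a minimal sketch, not the actual bot: the Slack plumbing is left out, the model name is a placeholder, and the request shape is the standard OpenAI chat payload:

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Net.Http.Json;
using System.Text.Json;
using System.Threading.Tasks;

static class SlackAiBot
{
    // Sketch: turn the messages of one Slack thread into a single chat-completions call.
    // The "/askcsharp" behavior is nothing more than a system prompt prepended to the thread.
    public static async Task<string> AskCSharp(
        IEnumerable<(string Role, string Text)> threadMessages, string apiKey)
    {
        var messages = new List<object>
        {
            new { role = "system", content =
                "You are an assistant that provides C#-based answers. " +
                "Use var for variables, xunit and fluentassertions for tests." }
        };

        // Every message already posted in the thread becomes chat context, which is
        // what makes a Slack thread behave like a ChatGPT conversation window.
        foreach (var (role, text) in threadMessages)
            messages.Add(new { role, content = text });

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);

        var response = await http.PostAsJsonAsync(
            "https://api.openai.com/v1/chat/completions",
            new { model = "gpt-4o", messages }); // model name is a placeholder - use whatever you prefer

        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        return doc.RootElement.GetProperty("choices")[0]
                  .GetProperty("message").GetProperty("content").GetString()!;
    }
}
```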

If you want to be really fancy you can even just vectorize your codebase, store it in Pinecone or PGVector, and have an "entire codebase aware AI"
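
Roughly, that retrieval idea looks like this - a sketch only: in-memory cosine similarity stands in for Pinecone/PGVector, the embedding model name is an assumption, and the HttpClient is expected to already carry the Bearer token like above:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json;
using System.Threading.Tasks;

static class CodebaseSearch
{
    // Embed one piece of text via the OpenAI embeddings endpoint.
    public static async Task<float[]> Embed(HttpClient http, string text)
    {
        var response = await http.PostAsJsonAsync(
            "https://api.openai.com/v1/embeddings",
            new { model = "text-embedding-3-small", input = text }); // model name is an assumption

        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        return doc.RootElement.GetProperty("data")[0].GetProperty("embedding")
                  .EnumerateArray().Select(e => e.GetSingle()).ToArray();
    }

    // Plain cosine similarity; a real setup pushes this into Pinecone or PGVector instead.
    public static double Cosine(float[] a, float[] b)
    {
        double dot = 0, na = 0, nb = 0;
        for (var i = 0; i < a.Length; i++) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i]; }
        return dot / (Math.Sqrt(na) * Math.Sqrt(nb));
    }

    // Find the code chunks closest to a question, so they can be pasted into the prompt.
    // For brevity this re-embeds every chunk per question; in practice you embed the
    // codebase once and store the vectors.
    public static async Task<IEnumerable<string>> TopChunks(
        HttpClient http, IReadOnlyList<string> codeChunks, string question, int top = 3)
    {
        var questionVec = await Embed(http, question);
        var scored = new List<(string Chunk, double Score)>();
        foreach (var chunk in codeChunks)
            scored.Add((chunk, Cosine(await Embed(http, chunk), questionVec)));
        return scored.OrderByDescending(s => s.Score).Take(top).Select(s => s.Chunk);
    }
}
```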

It takes a bit of time to custom-build something, but these AIs are basically tools, and a custom-built tool for your specific purpose is probably going to outperform a generic version.

[–] [email protected] 4 points 9 months ago

The forks could just change their names so they're more easily found. For example, mRemote got pretty much abandoned, so mRemoteNG got created.

Or people could give their forks better names. For example, I've forked some dotnet6 project and called the fork {project}-dotnet8 - then when people look through the fork list on GitHub, it's not 20 forks all with the same name.

[–] [email protected] 7 points 1 year ago

Not using CultureInfo.InvariantCulture for basically everything
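
To illustrate why that's a common mistake - a minimal sketch; the exact behaviour depends on the machine's configured culture, simulated here with de-DE:

```csharp
using System;
using System.Globalization;

class CultureDemo
{
    static void Main()
    {
        // Pretend the app runs on a machine configured for German formatting,
        // where "," is the decimal separator and "." groups thousands.
        CultureInfo.CurrentCulture = new CultureInfo("de-DE");

        // Culture-sensitive parse: "1.5" gets read with German rules (typically as 15).
        Console.WriteLine(double.Parse("1.5"));

        // Invariant parse: "." is always the decimal separator, so this is 1.5 everywhere.
        Console.WriteLine(double.Parse("1.5", CultureInfo.InvariantCulture));

        // Same story when formatting values that get stored or sent over the wire.
        Console.WriteLine((1.5).ToString(CultureInfo.InvariantCulture)); // "1.5", never "1,5"
    }
}
```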

[–] [email protected] 2 points 1 year ago

I don't think so. I just took a screenshot of one random convo he's having about this, but there are loads more in a similar vein.

And all of his other posts besides this one seem legit on the surface.

So it would be pretty weird if he randomly had a very bad take and then just claimed "Lol, this was a troll post, gotcha!"... That's pretty much the 4chan defense when you get called out - "Haha guys, I'm actually not r-worded, I'm just trolling!"
