whofearsthenight

joined 1 year ago
[–] [email protected] 35 points 1 year ago (2 children)

Libertarianism is a great system if you're using it as a backdrop for a cyberpunk dystopia.

[–] [email protected] 7 points 1 year ago

Agreed. My lead at work wanted us to start trying/using Cursor.so (VS Code fork with AI as a builtin feature) and it's been pretty transformative. I don't see a lot of "hey write me a program that does x" but in my (limited) use of this, a simple "why doesn't this function work" has been pretty amazing.

I have a feeling this is a branding issue more than anything. When you could ask google plain language questions a decade ago and get responses, that seemed amazing. This to me seems like that but more advanced and I just hope they sort out the truthiness and privacy implications. On the one hand, I want the tech to advance, on the other, I would like it to not be such a privacy nightmare.

[–] [email protected] 33 points 1 year ago (3 children)

The version of this I always think of is the one where you're playing a video game and get stuck. And unlike today, where you might spend an hour before you give up and look up a walkthrough, in the '90s when you got stuck, you just... stayed stuck. Like, "well, I guess I'm going to spend the next week or two on the Water Temple running into every wall and bombing everything until hopefully something opens." Oh, and it turns out the solution is something you tried within the first 15 minutes but didn't get quite right.

[–] [email protected] 1 points 1 year ago

It's not really that the developers are cheaper; it's the vast reduction in complexity that's cheaper. Let's say you've got a great general app idea and you're going to build a startup. Your app is going to have to be mobile and desktop. To do that well, natively, this means:

  • You're going to need a backend dev, who is probably going to be building APIs that touch on web tech.
  • You're going to need a developer team who can target Apple platforms, Android, and Windows. I lump Apple together here because although it's not entirely fair to say that it's as simple as they promise where you just click a box and your iOS app works on macOS, you're at least able to work in the same general toolset (Swift, SwiftUI, Xcode, etc.)
  • You're going to need designers who can design to the specific needs of the platforms, which is also going to mean more domain expertise.
  • You're going to need testing for each of those platforms.
  • This is true regardless, but you're going to have to deal with more platform-specific support, more platform-specific documentation, etc. How do you do x on platform y? Where is the button on this platform vs. that one?
  • Maintaining feature parity as you continue to build is going to be much more difficult, and you're going to have to decide whether to maintain feature parity and slow the whole process, or give up and launch on some platforms first (hopefully no one uses a Mac and an Android phone, or Windows and an iPhone, or an iPhone and a Samsung tablet, or that gets annoying real fast).

In short, moving from one platform to two natively doesn't just double complexity and cost; it's far, far worse than that. It's not that a good web dev costs $70k vs. an iOS dev who makes $90k. It's that a good iOS dev costs $90k, and a good Android dev costs $85k, and a good Windows dev costs $80k, and one of those people hopefully is familiar enough with each platform to be the team lead, so you can tack on another $20k for them...
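The back-of-envelope math above can be sketched out, using only the hypothetical salary figures from the comment (these are illustrative examples, not real market data):

```python
# Illustrative cost comparison using the comment's hypothetical figures.
native_team = {
    "iOS dev": 90_000,
    "Android dev": 85_000,
    "Windows dev": 80_000,
}
lead_premium = 20_000  # one of the devs doubles as team lead
web_dev = 70_000       # a single cross-platform web dev

native_total = sum(native_team.values()) + lead_premium
print(native_total)                          # 275000
print(f"{native_total / web_dev:.1f}x")      # 3.9x
```

So even with made-up numbers, three native platforms land at roughly 4x the salary cost of one web dev, before you account for the coordination overhead.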

And all the while you're building that team and your three different platform-native apps, a competitor or several will launch on Electron and web tech and take the market, because no one except us nerds gives a shit about whether something uses the right platform idioms or even knows what they are, and far fewer still have any idea how to check RAM usage and the like.

[–] [email protected] 6 points 1 year ago

Also, in a lot of cases, supermarket tomatoes are nowhere close to ripe. Supermarket tomatoes are generally garbage anyway, but they improve if you can give them a day or two to ripen.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

You know, as an amateur with massive impostor syndrome who's probably going to be applying for jobs soon, this comment and those like it give me strength.

[–] [email protected] 0 points 1 year ago (1 children)

Your comment, which I quoted in my reply, is itself a speculative and anecdotal piece of data. My "vitriolic" anecdote is literally just an anecdote, which is in no way vitriolic. My "inflammatory" comments? Can we not say fuck on the internet anymore?

I also work in IT supporting various tablets. This goes back a bit before USB-C was gaining traction, but we mostly use iPads (like 4-5x more iPads) and used Fire tablets with micro USB. The iPads get used like iPads; the Fire tablets were mostly stationary. Since around 2015 when we adopted iPads, I've never had to replace one due to a failed port. We stopped buying Fire tablets because I had to replace about a third of them due to port failure within around 2 years of service, vs. 8 years on the iPads with no failures.

Granted, this sample size here is only like 50 devices in total so I'm sure it's just more "vitriolic" anecdata.

"your perspective focusing more on negative experiences than positive, etc."

Good point. On the positive side, I've never seen a Lightning port fail in my family, my extended family, any of my friends, or at my job. Another positive is that micro USB is almost dead, replaced by much less trashy connectors. Golly gee, don't tell all the micro USB stans (who I guess have shown up exclusively for this thread) I said that, would hate to offend. Of all the things I've ever said on the internet, boy did I not expect calling micro USB garbage to be controversial.

[–] [email protected] -4 points 1 year ago (4 children)

I’m skeptical that this happens anywhere remotely near frequently enough to condemn the entire technology. Never seen it, and never heard about it until just now.

I have three kids. I have had to buy parts off of eBay and replace the ports on both of my PS4 controllers. They see way, way less use than any of the kids' iPhones, or mine or my wife's. I have a Kindle that has to be propped just so to charge.

I really don't give much of a shit if everyone in this thread hates Apple, but micro USB is a fucking garbage connector that, frankly, USB-C shares too much in common with. If you stepped on a Lightning cable that was in a device, you probably need a new cable. If you stepped on a micro USB, you probably need a new device and a new cable. USB-C, you need a new cable and you might need a new device (see also: me figuring out how to fix the kids' Oculus this week).

[–] [email protected] -4 points 1 year ago (1 children)

That article sources nothing.

[–] [email protected] 4 points 1 year ago

They paid for the whole disk, they're going to use the whole disk.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

Once. FireWire, then 30-pin.

[–] [email protected] 1 points 1 year ago

"I see the problem, you're going to want to not chew on those."
