Earlier this year, Microsoft added a new key to Windows keyboards for the first time since 1994. Before the news dropped, your mind might've raced with the possibilities of such an addition. The button, however, turned out to be a Copilot launcher that doesn't even work in a particularly innovative way.
Logitech announced a new mouse last week. I was disappointed to learn that the most distinct feature of the Logitech Signature AI Edition M750 is a button located south of the scroll wheel. This button is preprogrammed to launch the ChatGPT prompt builder that Logitech recently added to Options+, its peripherals configuration app.
Like Logitech, Nothing is trying to give its customers quick access to ChatGPT, in this case by pinching an earbud. This month, Nothing announced that it "integrated Nothing earbuds and Nothing OS with ChatGPT to offer users instant access to knowledge directly from the devices they use most, earbuds and smartphones."
In the gaming world, for example, MSI this year announced a monitor with a built-in NPU that can quickly show League of Legends players when an enemy is approaching from outside their field of view.
Another example is AI Shark and its vague claims. This year, the company announced technology that brands could license to make an "AI keyboard," "AI mouse," "AI game controller," or "AI headphones." The products supposedly use some unspecified AI tech to learn gaming patterns and adjust accordingly.
Despite my pessimism about the droves of AI marketing hype, if not outright AI washing, likely to barrage tech announcements over the next couple of years, I hope that consumer interest and common sense will supply enough skepticism to keep the worst so-called AI gadgets from getting popular or misleading people.
What are they using as input? Like, you can have software that controls a set of outputs learn which output combinations are good at producing a desired input signal.
But you gotta have an input, and looking at their products, I don't see sensors.
I guess they have smartphone integration, and that's got sensors, so if they can figure out a way to get useful arousal data out of those somehow, that'd work.
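To make that concrete, here's the kind of loop I'm imagining: a minimal epsilon-greedy sketch, purely mine, where read_arousal_proxy() is a made-up stand-in for whatever sensor signal (smartphone-derived or otherwise) they'd actually need.

```python
import random

# Purely hypothetical sketch of "learn which output settings produce the
# strongest response at some input": a tiny epsilon-greedy loop.
# read_arousal_proxy() is made up; it stands in for a sensor signal
# that their products don't appear to have.

SETTINGS = [0.2, 0.4, 0.6, 0.8, 1.0]  # candidate output intensities

def read_arousal_proxy() -> float:
    """Placeholder for a real measurement (heart rate, etc.); stubbed so this runs."""
    return random.random()

def learn_best_setting(trials: int = 200, epsilon: float = 0.1) -> float:
    totals = {s: 0.0 for s in SETTINGS}
    counts = {s: 0 for s in SETTINGS}
    for _ in range(trials):
        # Explore a random setting occasionally; otherwise exploit the best-known one.
        if random.random() < epsilon or not any(counts.values()):
            setting = random.choice(SETTINGS)
        else:
            setting = max(SETTINGS, key=lambda s: totals[s] / max(counts[s], 1))
        response = read_arousal_proxy()  # the "input" you need to close the loop
        totals[setting] += response
        counts[setting] += 1
    return max(SETTINGS, key=lambda s: totals[s] / max(counts[s], 1))

print("best setting:", learn_best_setting())
```

Point being: without some real input signal, there's nothing for the "learning" part to learn from.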
googles
https://techcrunch.com/2023/07/05/lovense-chatgpt-pleasure-companion/?guccounter=1
Hmm.
Okay, so the erotica text generation is legitimately machine learning, but that's not directly linked to their hardware.
Ditto for LLM-based speech synth, if that's what they're doing to generate the voice.
It looks like they've got some sort of text classifier to estimate intensity, i.e. how erotic a given passage of the text is, and then they just scale the intensity of the device their software is controlling based on that score.
Trying to quantify the emotional content of text isn't new (sentiment analysis is a thing), but I assume they're using some existing system for that, and that they aren't able to train it further based on how people react to their specific product.
I'm guessing this is gluing together existing systems that were built with machine learning, rather than doing any learning themselves. Like, they aren't learning the relationship between the settings on their device in a given situation and human arousal. They're assuming a simple "people want higher device intensity at more intense portions of the text" relationship, and then using existing, already-trained systems as the input.
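That glue layer is simple enough to sketch. Roughly what I'm picturing (my guess, not their code), with score_intensity() standing in for whatever off-the-shelf classifier they're using:

```python
# Rough sketch of the "classifier score -> device intensity" glue described
# above. score_intensity() is a toy stand-in for whatever existing
# sentiment/intensity model they use, and set_device_intensity() is made up too.

def score_intensity(passage: str) -> float:
    """Hypothetical scorer: 0.0 (neutral) to 1.0 (very intense)."""
    spicy_words = {"breathless", "trembling", "moan"}  # toy heuristic, not a real model
    hits = sum(word in passage.lower() for word in spicy_words)
    return min(1.0, hits / 3)

def set_device_intensity(level: int) -> None:
    """Hypothetical device call; real control would go through their app or Bluetooth."""
    print(f"device intensity -> {level}")

def drive_device_from_text(passages: list[str], max_level: int = 20) -> None:
    for passage in passages:
        score = score_intensity(passage)   # how "intense" is this passage?
        level = round(score * max_level)   # simple linear mapping to the device range
        set_device_intensity(level)

drive_device_from_text([
    "They talked about the weather.",
    "She was breathless, trembling as she leaned in closer.",
])
```

The mapping itself is trivial; all the actual ML lives in whatever produces the score.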
Lovense is basically just using AI to make a line go up and down, raising and lowering vibration intensities. They have tons of user-generated patterns and probably some tracking of what people are using through other parts of their app. It's really not that complicated an application.
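In other words, something like replaying a stored pattern curve over time; the sketch below is mine and the names are made up:

```python
import time

# Made-up illustration of "a line that goes up and down": replay a stored,
# user-generated pattern as a sequence of intensity levels. No learning involved.

PATTERN = [2, 5, 9, 14, 18, 20, 16, 10, 6, 3]  # one stored curve, 0-20 scale

def set_device_intensity(level: int) -> None:
    """Hypothetical device call, as in the sketch above."""
    print(f"intensity -> {level}")

def play_pattern(pattern: list[int], step_seconds: float = 0.5) -> None:
    for level in pattern:
        set_device_intensity(level)
        time.sleep(step_seconds)

play_pattern(PATTERN)
```

That's the whole trick; no feedback loop required.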