- cross-posted to:
- technology@beehaw.org
As a normal user, I don’t find AI useful.
Like, anybody’s, for much of anything other than generating fever-dreams and Plex art.
Code, though. Bash scripts, maybe, but it’s not necessary for me.
It’s almost like Siri already does what people want it to, and anything beyond that is a waste of time and resources.
It still needs to learn. I’m personally trying to opt out of it watching everything I do; there will have to be some pretty serious benefits for me to revert.
How long does it take to learn? It should be able to scan all of my files locally, and I should be able to search for songs by lyrics or images by description and metadata locally. It supposedly has GitHub Copilot-like functionality in Xcode…
Get it off! Get it off!!
Is it worth the hype that Apple and cell carriers are throwing at it? Not really, but do I, as a user, enjoy a lot of the new features? For sure.
Double tap to type to Siri is great, and access to ChatGPT for answers Siri doesn’t know is much better than, “I couldn’t find the answer. Would you like me to search the internet?” And as a person with slight dyslexia and ADHD, Proofread is a fucking godsend. My only gripe is the lack of both ChatGPT and search-the-internet options. I want to use both in various situations when Siri doesn’t get the answer directly.
That’s a fair point.
The only bit of excitement I’ve experienced about this was when they announced it would be force-disabled in Europe, so I didn’t have to turn it off myself.
Oh don’t worry, it’s coming to the EU.
That’s horrible news
Tbf, it’s opt-in for now, so it isn’t that bad yet.
I’m very much enjoying the GenMoji stuff. Being able to send or react with an emoji tailored to the situation is not useful, but it’s fun when you come up with a good one.
Also Siri is definitely more functional than it used to be. It understands when I correct myself or change my mind. Very handy. Still far from perfect though.
Also on iPad all the AI-driven handwriting cleanup and stuff is really nice when taking notes.
But otherwise it’s not super useful. I don’t like the notification summaries, they aren’t very good. Though they are sometimes hilarious. Like Ring being summarized as “Thirteen people at your door and gunshots heard.”
How do you use Genmoji? Is it not rolled out to everyone on 18.2? I looked on YouTube and it’s shown as an option in iMessage that doesn’t display for me.
Gotta have iPhone 15 Pro, 16, or 16 Pro, unfortunately. Or an M-series iPad.
I have that. Still not getting the option.
Ah! Have you previously enabled Apple Intelligence? I think in 18.1 there was a queue thing to set it up. Could be that. I also vaguely remember there being something it had to download for the image stuff. Check in Settings, maybe? I was half asleep when I got it working so I don’t remember too well.
Thanks, I think I figured it out. I had Siri set to German, but once I switched it to English it started displaying in iMessage. Still says it’s downloading, so I can’t actually use it yet. Must be English exclusive?
Not surprising, I vaguely recall them saying they would roll out other language support over time.
This whole staged rollout very much smells of, “We were caught out by the industry shifting hard to AI and, despite including neural engines in our chips for a while, we weren’t ready.”
Yeah Siri being more understanding is pretty nice and has gotten me to actually use it more again, but beyond that none of it is super useful to me.
…I did enjoy finally getting to make “shrimp with cowboy hat” using Genmoji after Apple kept using that as an example, though
I made a “Sanderlanche” emoji for use when discussing Brandon Sanderson’s mastery of story structure. Reading every one of his books I reach a point where it feels like I have to frantically push to the end.
It’s just a book with a big vaguely-snowy wave coming out of it, but I like it.
I went into settings on my phone and disabled it immediately
For me the best new feature on macOS is the ability to natively put the temperature in the menu bar. You click on it, and it gives you some more info and from there can launch the full weather app.
It’s a small addition and could have been there for a decade, but I like it a lot.
MacOS didn’t have that before? That’s impressive. Windows has had it for a long time, and KDE obviously does too, and with KDE you can put it anywhere. I can’t understand why people still act like Apple products are premium.
Yep. Apple was trying to get things OUT of the menu bar for a long time. I dunno why current leadership has changed their tune.
Well you can install a menu bar app if you want. Just like you can with any other system. I don’t ever care about the weather so I want my menubar as clean as possible, so I use bartender to hide almost everything.
For checking the weather I actually use Windows.
they only added it to my device like yesterday but I can already tell it’s just Siri with a different screen effect. the image generator is the least intuitive prompt I’ve seen yet.
Genmoji is a waste of space. The image generation is really bad (but then again, most of these platforms are). The writing tools are mediocre. About all that is moderately useful is that Siri seems a little better at processing commands.
If they want to start charging for this, I’m out.
I’m not sure why they would charge for it, most of it happens on-device.
True, but the R&D ain’t cheap. And if everyone else starts charging (as I am sure they eventually will), Apple will follow.
That’s why Apple charges an arm and a leg for RAM.
I am interested in AI stuff, but I do not want it controlled by some company; I’d like to run my own models.
Yup. Photo cleanup was cool to try once, but I’ll never use it again. Removing stuff from photos with a single tap also bugs me a bit in general, I’m not sure it’s something we should make so easy. Message summaries are absolute shit and have already caused confusion for me. I’m not even talking about the proper notification summaries, just the auto-summaries in the preview lines of the whole iMessage list. A number of them have really fucked with me. For example, a friend asked me to FaceTime her in a few days, and the summary just said “FaceTime request.” And I was like “shit, did I miss a call?” As far as I can tell I can’t turn that off without disabling the entire AI setting.
I’m also not sure how to feel about all of Apple’s privacy talk when it comes to their AI features. They say certain features will stay on device, which is great, but for everything else, as far as I’ve noticed there is no mention of what goes to OpenAI’s servers, since their AI is still primarily powered by OpenAI. There’s actually no mention of OpenAI in any of the disclaimers or warnings I read when I first enabled it.
There’s no way OpenAI is letting Apple use them for free. So where is the money coming from? AI is the hot new thing and expensive to operate, so I imagine Apple is paying quite a lot. There needs to be a new form of income or this wouldn’t make sense from a business perspective. I imagine there is data harvesting from the AI service.
I’d actually be surprised if Apple pays anything to OpenAI at the moment. Obviously running some Siri requests through ChatGPT (after the user confirms that’s what they want to do) is quite expensive for OpenAI, but Apple Intelligence doesn’t touch OpenAI servers at all (just Siri has ChatGPT integration).
Even then, there’ll obviously still be a lot of requests, but the problem OpenAI has is that they aren’t really in a negotiating position. Google owns Android and so most phones default to Gemini, instantly giving them a huge advantage in marketshare. OpenAI doesn’t have its own platform, so Apple having the second largest install base of all smartphone operating systems is OpenAI’s best chance.
Apple might benefit from OpenAI but OpenAI needs Apple way more than the other way around. Apple Intelligence runs perfectly fine (I mean, as “perfectly fine” as it currently does) without OpenAI, the only functionality users would lose is the option to redirect “complex” Siri requests to ChatGPT.
In fact, I wouldn’t be surprised if OpenAI actually pays Apple for the integration, just like Google pays Apple a hefty sum to be the default search engine for Safari.
Yeah, you nailed it. The latest reporting on this says that Apple isn’t paying them yet, because they think OpenAI will get more benefit out of just having their product in everyone’s faces:
Apple isn’t paying OpenAI as part of the partnership, said the people, who asked not to be identified because the deal terms are private. Instead, Apple believes pushing OpenAI’s brand and technology to hundreds of millions of its devices is of equal or greater value than monetary payments, these people said. Source.
You can turn off Message summaries specifically in Settings > Apps > Messages > Summarize Messages.
Well shit, thank you. I swear I searched for that setting before, but there it is!
iOS settings are like mirages I swear. Whoever designed the UI should be declared criminally insane
Omg I really thought I was the only one. I can never find the setting I want. I think it’s one of those examples where Apple oversimplified to the point of confusion.
Apple Intelligence isn’t “powered by OpenAI” at all. It’s not even based on it.
The only time OpenAI servers are contacted is when you ask Siri something it can’t compute with Apple Intelligence, but even then it clearly asks the user first if they want to send the request to ChatGPT.
Everything else regarding Apple Intelligence runs either on-device or on their “Private Cloud Compute” infrastructure, which apparently uses M2 Ultra chips. You then have to trust Apple that their claims regarding privacy are true, but you kind of do that when choosing an iPhone in the first place. There’s some pretty interesting tech behind this actually.
I appreciate the clarification! I definitely misinterpreted the reporting about this, and clearly didn’t dig deeply enough. This makes me feel a bit better about using the non-ChatGPT features.
I appreciate the summaries on my notifications. Some of my people text a book every time.
it’s just one big pile of meh. but then i don’t even use Siri, so i’m not really the target audience for anthropomorphized chatbots.
As a general rule, I feel the same about more or less all of the “AI” that is available to consumers from the likes of Google, OpenAI, etc.
It just seems like a different way to do things with digital assistants or search engines that we have already been able to do for years.
IMHO, they have pretty different use cases. For example, I can’t use a search engine to compose things.