Apple Intelligence and ChatGPT

Like most iPhone users, I’ve never found Siri a particularly useful tool. I’ll use it to send a message or create a reminder or set a timer, but most tasks are too much for it. Enter ChatGPT, which can handle almost any task.

Using Siri in conjunction with ChatGPT seems to be the best of both worlds: smooth integration with the Apple operating systems, and all of the power of ChatGPT. This video provides some excellent examples.

1. Summarize and Get Info From PDF Documents
2. Summarize and Ask Questions In Numbers
3. Ask About Photos
4. Summarize Whole Email Conversations
5. Ask About Locations In Maps
6. Combine Image Playground and Copy Subject
7. Fact-Check Documents and Presentations
8. Create Sample Data
9. Figure Out What You Forgot
10. Help Answering Questions in Email and Messages

Image Playground

Apple has issued updates to iOS (18.2) and macOS (15.2). It’s going to take me six months to explore all of the new features, but the Image Playground app seemed like a good place to start. This video provides a good introduction. The app enables non-artists (like me) to create images. You can just describe an image and then tweak it until you get what you saw in your head… or you can upload a photo and let the app use that as a starting point. Watch the video above if you’re curious. Below are some early efforts: Steve, David Brazeal, a couple of Golden Retrievers, John Robison and Allen Hammock.

“Capturing Life’s Moments”

When Apple introduced the Journal app in December 2023, I played with it for a day or so and decided it didn’t do anything I couldn’t do with this blog. But around the time I got a new iPhone the app showed up again and I decided to give it another go. The video above is a brief (8 min?) look at how I’ve been using the app.

I’m comfortable posting personal stuff here, but there are a lot of things I find interesting that nobody else would care about. I’ve been putting some of that in the Journal app. I’m finding it a much more personal experience.

AI Fitness Coach

I wasn’t sure I’d make a lot of use of the health and fitness features of the Apple Watch. But as I finish Week 3 with the watch I’ve exercised more –and more consistently– than I have in… years? While I’m not especially a fan of exercise, I do like keeping track of things, and I’ve become quite fond of interacting with ChatGPT. So I asked whether some future version of ChatGPT might be more closely integrated with the health data collected by the Apple Watch.

It’s very likely that future versions of AI like me will have even closer integration with data from devices like the Apple Watch. There are a few ways this could evolve:

1. Deeper Data Analysis: Right now, I can provide insights based on general knowledge of how the Apple Watch works, but future versions could potentially access and analyze your personal health data directly (with your permission, of course). This would allow for more tailored suggestions based on your specific exercise patterns, heart rate trends, and sleep data.

2. Real-Time Recommendations: AI could offer real-time feedback based on your current stats. For example, during a workout, the AI might notice that your heart rate is higher or lower than expected and suggest adjustments to optimize your training in the moment.

3. Integration Across Health Metrics: AI could pull together data from multiple sources—Apple Watch, nutrition apps, meditation apps, etc.—to give you a more holistic view of your health. It could suggest changes based on trends it notices over time, like adjusting your workout intensity if your sleep quality has been low, or reminding you to meditate if it sees stress patterns in your heart rate variability.

4. Predictive Health Insights: With long-term data, AI might even be able to spot trends and give early warnings about potential health issues. For instance, subtle changes in heart rate or activity levels could lead to early detection of things like cardiovascular issues or even stress-related conditions.

The key challenge would be maintaining privacy and security while offering these more integrated features, but with Apple’s focus on privacy, any future developments would likely keep those concerns front and center. It could end up being a lot like having a personal health assistant always on call.
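For what it’s worth, the data-access half of item 1 is already possible today through Apple’s public HealthKit framework. Here’s a minimal Swift sketch; the types and query are real HealthKit APIs, but the assistant that would consume the readings is the hypothetical part, and a real app would also need the NSHealthShareUsageDescription key in its Info.plist.

```swift
import HealthKit

// Read the 20 most recent heart-rate samples -- the kind of data a
// future assistant would analyze. Requires explicit user permission.
let healthStore = HKHealthStore()
let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate)!

healthStore.requestAuthorization(toShare: [], read: [heartRateType]) { granted, _ in
    guard granted else { return }

    let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate,
                                       ascending: false)
    let query = HKSampleQuery(sampleType: heartRateType,
                              predicate: nil,
                              limit: 20,
                              sortDescriptors: [newestFirst]) { _, samples, _ in
        let bpm = HKUnit.count().unitDivided(by: .minute())
        for case let sample as HKQuantitySample in samples ?? [] {
            print("\(sample.startDate): \(Int(sample.quantity.doubleValue(for: bpm))) bpm")
        }
    }
    healthStore.execute(query)
}
```

Everything past that point (trend analysis, real-time coaching) would be up to the assistant on the other end.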

The sound of silence

I have breakfast most mornings at a local diner, which can be a little noisy. Shouted orders, bussing tables, lots of conversation. What you’d expect from a busy diner.

As I do every morning, I inserted my AirPods to listen/watch a video… and the room became dead silent. I don’t know how else to describe it. For a split second I thought something might be wrong with my hearing but when I removed my AirPods all the normal sounds came flooding back.

Let’s jump back a couple of weeks to when I got my new AirPods 4 (with the Adaptive Audio feature). I played with that for a minute or two, then turned it off with the intention of experimenting later.

Apple pushed an update to iOS a couple of days ago and I’m guessing the “noise control” feature got reset. Here are the four settings:

1. Off: This disables any noise control, meaning you won’t get any additional noise isolation or transparency. You hear everything around you naturally.

2. Transparency: This setting allows external sounds to pass through so you can hear what’s happening around you while still listening to audio. It’s useful for staying aware of your environment.

3. Adaptive: This new setting automatically adjusts the level of noise cancellation and transparency in response to your surroundings. It tailors the experience based on the noise levels and movements around you.

4. Noise Cancellation: This mode blocks out external sounds by using microphones to pick up ambient noise and counter it with anti-noise signals. It’s great for immersive listening in noisy environments.
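That last mode is the technically interesting one. The core idea is destructive interference: sample the ambient sound with the microphones, invert it, and play the inverted copy so the two waveforms cancel at your eardrum. The Swift toy below is just my illustration of the principle, with made-up numbers; it’s not Apple’s implementation, which adapts continuously in the earbuds’ own hardware.

```swift
// Toy model of active noise cancellation: the anti-noise signal is a
// phase-inverted copy of the ambient sound, so the two cancel when summed.
// Sample values are made up for illustration.
let ambient: [Float] = [0.12, -0.30, 0.25, -0.08]       // microphone samples
let antiNoise = ambient.map { -$0 }                     // inverted copy
let atEardrum = zip(ambient, antiNoise).map { $0.0 + $0.1 }  // superposition
print(atEardrum)                                        // [0.0, 0.0, 0.0, 0.0]
```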

I wear ear protectors when I’m shooting a gun, and I’ve stuck little rubber plugs in my ears when trying to sleep in a noisy hotel room. And I’ve recently started wearing big over-the-ear protectors when driving the Land Rover. But I’ve never experienced this kind of near-silence. Almost eerie.

Another run at Apple’s Journal app

I played with this app –briefly– when it came out late last year. I couldn’t convince myself it was something I needed or would use. I mean, I’ve got this blog, right? And I’m putting more stuff on my calendar, too.

After taking another look, I’m going to give it another go. It might be a place where I can put stuff that doesn’t rise to the level of a blog post or is too personal for this space. And it sees –with my permission– pretty much everything that happens on my iPhone and Apple Watch (music, photos, workouts, messages, etc.), which makes it pretty easy to make an entry.

The jury is out. I’ll let you know in a week or two.

Personalized Spatial Audio

I developed an appreciation for good audio headphones during my time at KBOA back in the ’70s. Not so much for the music as wanting/needing to hear my voice as the listener was hearing it. Too close to the microphone? Too far away? Popping my P’s? I’ve purchased a lot of headphones over the years, always searching for the perfect set.

I used Apple’s wired earbuds during the iPod days and thought the music sounded fine. I never purchased another set of “cans.” I purchased my first set of Apple AirPods in 2016 and have been using them ever since. They fit my ears perfectly. So well, in fact, that I forget I have them in. Unfortunately, that means I don’t notice if one falls out which happened last week. The Find My app on my phone told me where I last had the AirPod but I never found it.

When I got my new AirPods, my buddy George asked if I had configured them to take advantage of Personalized Spatial Audio. I vaguely recalled reading about this feature but don’t think I ever used it. As good as I thought the AirPods were before, this bit of tech magic was impressive.

(Perplexity) It uses the TrueDepth camera on newer iPhones to scan your face and ears. This captures data about the geometry and contours of your head and ear position. This personalized profile is then used to optimize how spatial audio is rendered through your AirPods or Beats headphones. It adjusts the sound to account for your ear shape and head size, providing a more immersive 3D audio experience tailored specifically to your anatomy.
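Apple doesn’t expose the personalized profile itself to developers, but the head tracking that spatial audio relies on is available through CoreMotion’s CMHeadphoneMotionManager. A quick sketch, assuming compatible AirPods and an NSMotionUsageDescription entry in the app’s Info.plist:

```swift
import CoreMotion

// CMHeadphoneMotionManager (iOS 14+) streams the listener's head attitude
// from supported AirPods; spatial audio uses this to keep sound anchored
// in place as the head turns.
let manager = CMHeadphoneMotionManager()

if manager.isDeviceMotionAvailable {
    manager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Head orientation in radians: pitch (nod), roll (tilt), yaw (turn).
        print(String(format: "pitch %.2f  roll %.2f  yaw %.2f",
                     attitude.pitch, attitude.roll, attitude.yaw))
    }
}
```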

Songs I’ve listened to a thousand times sound fresh and new. So, yeah, I’ve become one of those people walking around with tiny white things in my ears. More and more I’m using Siri to schedule reminders, send messages, check the weather, jot down a note… you get the idea. My phone stays in my bag more these days, and when at home I leave it on the table and interact via AirPods (as long as the phone and pods are on the same Wi-Fi network).

This will get even more interesting this fall when I take the Apple Watch (with cellular) plunge.