(upbeat music) – So we just got done with possibly one of the biggest Apple WWDC keynotes of all time, maybe, just because of the state of the world and what everyone expects from tech companies in 2024, which is AI at the front of what you do. Interestingly, this time Apple had AI in the back half of what they did. Literally, the entire first half of the keynote, they didn’t say AI once, zero times. Then the second hour, they basically just went, "All right, yeah, we, we actually do have a lot of AI stuff to share with you." And they just spit it all out. And of course they did manage to brand it, because of course they did. AI for them is now Apple Intelligence. But look, I watched and live-tweeted the entire two-hour-long keynote so you don’t have to, and I’ve gathered all my thoughts on the most important announcements. So this is everything you need to know that’s new, and some of my takes on it.

So I’m gonna go in the same order that Apple went. I’m gonna start with visionOS, and we’re already on, somehow, visionOS 2.0, less than half a year after the Vision Pro first came out. And there are some solid new features, but I don’t know if I would call this 2.0-worthy. It feels more like a 1.2 update, but, okay, whatever, we’ll go with it. So there’s a new wrist-turn gesture to quickly see the time and your battery percentage, and then you can jump right from there into the Control Center, instead of having to look way up at the ceiling every time like you had to before. It kinda reminds me of the Oculus Quest. That is easy. That’s an upgrade. There’s a bunch of immersive media features, like they let you go through your old photos and use advanced machine learning to turn them into spatial photos, which is really interesting if it works. And there are also new tools to help people create spatial videos and immersive videos on cameras other than the iPhone, but then of course you can only view them on a Vision Pro.
So who knows how valuable this will actually turn out to be. But then, this year, Mac mirroring will get way more resolution thanks to it now doing the foveated rendering on the Mac. So basically it frees up a ton of resources and lets you do up to double the resolution, so you could have a gigantic ultrawide if you want. That is way more room for windows and activities. Love that. And that’s pretty much it. Again, it’s not the biggest 2.0 update we’ve ever seen, but there are some nice updates in there, including some new environments and the ability to rearrange your home screen. Believe it or not, you couldn’t do that before. Now you can.

But speaking of rearranging your home screen, iOS just got one of the biggest, or at least most interesting, updates in a long time. And I’m just gonna… If you’re an Android user, I’d say for the next two or three minutes, just go ahead and unplug your keyboard for a second and relax. ‘Cause yes, we’ve had these features for a long time, but now it’s the iPhone’s time to shine, all right? So, officially, you will finally be able to put icons wherever you want on the iOS 18 home screen grid. And it was hilarious that this got a whole applause moment from the audience during the keynote, as if they just invented a new feature. But fine, you’ve been waiting a long time for this. Have your fun. But they didn’t stop there. They added a whole bunch of other customization features to the iPhone home screen too, and it started to get a little more interesting. See, there’s this new theme engine now that lets you go in and change the color of every icon and widget on your home screen to the same color. So you can match your wallpaper with your icons to frame it nicely and have matching colors everywhere. It’s a simple thing, and we’ve been doing it on Android with launchers for years, but the thing is, before this, basically every iPhone home screen looked the same.
We’ve been saying that. They’ve just given you the ability to go crazy and kind of ruin your home screen if you want to, and people seem to absolutely want to. I have seen some rough home screens people are making on Twitter. A lot of them just look terrible. And I think the main thing I noticed is it seems like there are apps that support this whole colored-icon thing well, and then there are apps that don’t, where it just literally applies a color cast to the entire icon on your home screen, which causes all kinds of legibility problems. Doesn’t look good. There are widgets that look horrible. It’s just unreadable. So yeah, you can make some pretty-looking home screens if you want, but you can also make some really ugly-looking home screens. And I was trying to decide if I like this more or less than what Google did on the Pixels with Material You. I think, technically, Apple’s is better because it works on all of the icons. I don’t know how many setups I’ve tried where I wanna do the monochrome icons, but there are just a few of my apps that aren’t supported, so they end up sticking out like a sore thumb. So at least Apple’s lets you change all of the icons at once.

But anyway, Apple’s also redesigned the Control Center to be fully customizable across multiple pages. So again, there’s a lot more going on here, and there are many more customizable shortcuts in more places, including finally replacing the flashlight and camera on the lock screen without jailbreaking, finally. And the list keeps going. Stop me if you’ve heard these before, but you can do hidden apps now inside your App Library. So you can lock apps into a hidden folder, which makes it easier than ever to hide Tinder from your wife. There’s also scheduled text messages in iMessage. Great, finally, they’re having some fun over there. There’s also text formatting. There’s messages via satellite. RCS support was very casually but briefly mentioned.
The Photos app is redesigned. There’s also a game mode that minimizes background activity and Bluetooth latency for peripherals. Great. And there’s new automatic categorization in the new Mail app. So you can tell, there’s been a whole bunch of stuff. I will definitely be doing a larger, more focused iOS 18 video so you can see everything that’s new. There’s plenty of little stuff that didn’t make it into the keynote, so I’ll do a whole deep dive just for that. Make sure you subscribe to be among the first to see that. I believe we are catching up to Apple in subscribers. It’d be kind of nice if we passed them. That’d be sick.

So then AirPods got minor updates, like voice isolation during phone calls, spatial audio during games, and the ability to respond to a prompt just by nodding or shaking your head, taken right from the Sony headphones. I kinda like it, though. Controlling it without using your voice is nice.

Then Apple TV gets this new feature where you swipe down on the remote at any point during a show, and it just shows you all the actors and character names of the people on the screen, and any song that’s playing in that moment, in real time. And there’s also a feature that boosts people’s voices so you can always hear them over the music.

And then watchOS gets this little training feature that tracks the trends in your training over time and evaluates how hard a workout was versus how hard it could have been.

And then iPadOS. This was supposed to be Apple’s big chance to really convince us that the iPad is something special, something with a little something to it that would make us wanna get the newest one, the M4 one that just came out, over just a phone or a Mac. And I’ll say it did get one incredible feature, but it didn’t really do much more than that. So the iPad gets all the stuff I talked about with the iPhone, right?
Full home screen customization, customizable Control Center, the new Photos app, et cetera. It’s all great. There’s also a feature in SharePlay that lets you go in and actually remotely control someone’s iPad when they allow it, which seems like a small thing, but I promise you that is massive for family tech support. You know who you are.

But the one massive, actually impressive new feature on the iPad in iPadOS 18, and you’ll think I’m joking, but I’m dead serious when I say this: it’s the Calculator app. It’s the new calculator. I promise you, I was so ready to dunk on this. I had a tweet drafted up as they were talking about it. "We finally brought the calculator to the iPad," and there were screenshots of it, and it looked just like the iPhone app. And I was like, "Really? This is what you were so hyped about?" I had that clip queued up of Craig Federighi being like, "Yes, we had to do something super special to finally bring that calculator to the iPad." But then they pulled the Apple Pencil off the side, and they pulled up what’s called Math Notes, and it was pretty sick. So now, on the iPad, you can actually write down equations in handwriting, and it will understand what you wrote, and it’ll answer the question in your handwriting. And then if you adjust the equation or add new information or something, it automatically updates the answer, which was so sick. I don’t know why I’m so impressed with this, but it was really cool to play with and look at. And if that wasn’t impressive enough, it also supports variables. So you can have variables written all over the page. So look at this. It has G, and X, and A, and H, and then it’s labeled Y as the height here. So you can ask for an equation using these variables, and it can give you an answer. And if you want, you can do a y-equals equation, and it’ll give you a graph with real-time adjustment of any of the variables in your equation.
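To give a rough sense of what Math Notes is doing behind the scenes, here’s a tiny sketch, in Python rather than anything Apple actually ships: an expression with variables gets re-evaluated whenever one of them changes, the same way the handwritten answer updates live. All of the names and numbers here are made up for illustration.

```python
# Illustrative only: a toy version of "re-evaluate when a variable changes,"
# which is roughly the behavior Math Notes shows with a handwritten equation.
def evaluate(expr, variables):
    # Evaluate the expression against only the declared variables,
    # with builtins disabled so nothing else leaks into the namespace.
    return eval(expr, {"__builtins__": {}}, dict(variables))

# A hypothetical page of scribbled variables, like the g, x, a, h in the demo.
variables = {"g": 9.8, "a": 1.5, "x": 4.0, "h": 2.0}
expr = "g * h + a * x"           # pretend this was handwritten

y = evaluate(expr, variables)     # 9.8*2.0 + 1.5*4.0 = 25.6
variables["x"] = 6.0              # drag the "slider" for x on the graph
y = evaluate(expr, variables)     # answer updates automatically: 28.6
```

The real feature obviously layers handwriting recognition and rendering on top of this, but the core loop, parse the equation, bind the variables, recompute on every change, is the same idea.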
I mean, I wish I was in sixth grade again so this could be my full-time calculator. This is sick. Now, at this point in the keynote, they still hadn’t actually said the word AI a single time. I think this is pretty clearly AI. This is handwriting recognition. This is semantic understanding of all the things happening on the page. It’s good AI. Well played, Craig, you did it, okay, but this is before the actual AI section. Also, this wasn’t mentioned on stage, but it’s coming to the iPhone’s Calculator app as well.

But then macOS. macOS has its newest version. It’s called Sequoia, and it also gets a nice little grab bag of new features. There’s automatic window snapping now, finally built in. You don’t need a third-party app to do it anymore. Sherlocked. There are built-in backgrounds for any app that uses the webcam feed. There’s also a new Passwords app that basically just takes what was buried in Settings before and makes it its own standalone app. So it’s actually on Mac, iPad, and the iPhone. But probably the biggest thing is iPhone Mirroring. This is another continuity feature that basically just lets you wirelessly see your iPhone on your Mac at any time. It merges your notifications and audio with your Mac, and lets you drag and drop things back and forth between your phone that’s sitting next to you and your Mac. I kinda wonder if this hits your phone battery harder than normal. They didn’t really say, obviously, so I’m gonna have to test this. But I imagine if you’re just sitting on a wireless charger, then it won’t matter. But I’ll be checking that out.

But then we arrived. They did all the sections with all the other operating systems, and now we got to the part where Apple’s gonna talk about AI stuff, kind of. Now, I don’t know if you remember this, but a couple months ago, not that long ago, I did an entire video on all of these phrases that it seems like Apple refused to say on stage.
They just would not say ’em, and AI is one of them. They just didn’t say it. They’d say anything else: neural nets, machine learning, all sorts of other things like that. But such are the times: people wanna hear Apple talk about AI. So they finally did. And like I said, they also rebranded it just for themselves, to just commandeer the AI name. And so now it’s called Apple Intelligence in Apple Land.

So here’s basically what you need to know about Apple Intelligence, right? Apple has already had AI features on their devices before. They’ve had the Neural Engine inside of their chips, and they’ve done things like smartly cutting subjects out of photos. They’ve done things like autocomplete on the keyboard. That’s already existed. But this new Apple Intelligence is basically a bucket of new generative models and what they do on your devices. So there are these new diffusion models, generative models, and large language models that are all built by Apple and bring new functionality to the supported devices. And actually, I’ll just get that out of the way right off the bat: these are only supported on the highest-end versions of currently available Apple silicon. So that means the iPhone 15 Pro, any iPad with an M1 or later, and any Mac with an M1 or later.

So what does that look like? Well, basically, starting with these new OSes, there’s a small suite of tools that’s kind of sprinkled across everything. It’s not like there’s one Apple Intelligence app or something like that. They’re just kind of sprinkled throughout. So here’s an example: writing tools. We all know how powerful these large language models can be. So anytime you’re writing in Pages or Keynote, or basically anywhere there’s a cursor, you can use the writing tools to summarize or rewrite something that you’ve already written. It can change the writing style or just proofread, just basic useful stuff. Here’s another one.
Remember the Magic Eraser tool in Google Photos and the Android phones that use it? Apple’s finally doing that too. Built into their own Photos app is a Clean Up tool, and it’s basically the same idea. It’ll identify background items, let you circle things you don’t like in your photos, and just get rid of them, filling in the background with generative fill, super quick. And there are also Genmojis, generative emojis. And this made me feel really old, ‘cause I would never use this, but apparently there’s a whole group of people who go searching for emojis and find that there’s one that doesn’t exist, and they’re like, "Dang, I wish I could just make an emoji for this moment right now." And for those people, yeah, now you can. You can generate a new emoji. You can literally type it in like a prompt, and the diffusion model will create that new emoji from scratch in the style of all the other emojis. And they had an entire thing called the Image Playground that’s built into a bunch of Apple’s apps, but also has its own separate app, that lets you create these nice little square images with prompts in three different styles: sketch, illustration, and animation.

And yes, there are also Siri improvements that use the large language models to generally understand context better and just generally be better and more natural at being an assistant that isn’t garbage. And later in the year, it will also be able to pull info from inside of apps and take actions inside of apps for you too. I remember when this was a huge Bixby feature on Samsung phones, so that’s cool to see. Plus, there’s also this big, pretty, new full-screen animation when you’re triggering Siri to tie it all together. Its voice is apparently slightly updated, and you can now also type to Siri instead of talking to it out loud every time, which is long overdue but also nice.
There are Notes app summaries, there are phone call summaries, just all kinds of features sprinkled around like I said. I think I’ll end up making a video just summarizing or testing or reviewing all of the Apple Intelligence stuff, ‘cause it’s kind of all over the place, but now you know what it is.

But one question that’s been floating around the internet is: what stuff is happening on device, versus things that have to go to the cloud? And this has also come up a lot because you might have heard about Apple partnering specifically with OpenAI to integrate ChatGPT, powered by GPT-4o, into this Apple Intelligence stuff. So it’s a big question everyone’s wondering. So I got the official answers from Apple, and the answer is basically that almost everything happens on device, intentionally, and those are Apple-built models, so that should be the fastest stuff that happens. But in the chance that there are things that are too complex or just outside the area of expertise of Apple’s models, then it can basically go one of two ways. One is it will go to a larger server-based model that runs on servers Apple has built with Apple silicon, using what they call Private Cloud Compute. So basically the info is never stored, and it’s never accessible to Apple. It’s still gonna have the downsides of having to go off of the device, up to the cloud. It might take a bit longer if you’re in an area with terrible internet. It might not work at all. But in the chance that this is a big, complex thing that could benefit from those models, that’s what it’ll try to do. But the other is when you specifically ask it for something that ChatGPT would be good at. And if that happens, then it will specifically ask you, "Hey, is it cool if I ask ChatGPT for an answer to this question?" And then you can give it a yes or no on the spot.
So anytime it wants to do this, whether it’s to upload a photo that you’re asking something about or just do a complex prompt in general, this little dialog box pops up, and you have to say yes every time. And then you can tap into everything that OpenAI’s model is capable of, or generate even more realistic or varied styles of images that Apple’s diffusion models would never make. This is all free, without an account. At no point is OpenAI allowed to store any of these requests, and Apple has also said that they will obscure your IP address, so OpenAI can’t even connect multiple requests together to form a profile of you. There’s just a lot of thought that’s gone into ideally making this as secure and private a version of going to the cloud for AI features as they possibly can.

So yeah, it’s really interesting, a lot of interesting stuff going on here. I think my overall take with a lot of these WWDC announcements, and with the Apple Intelligence stuff, first of all, is that the Humane Pin and the Rabbit R1 were so doomed from the start. Even if they were good up to this point, there’s just no way they could be as good as this stuff on your phone, with all the personalization, all the info that they already know about you. So I guess that’s confirmed. But also, it does feel like we’ve entered a new age with Apple. I think the AI stuff is so important that it very much is what we’re most interested in, and it overshadows everything else. Honestly, what would even be your favorite non-AI feature from everything else they announced? Moving home screen apps and icons wherever you want, that might be the biggest thing, but at this point it’s all about what this intelligence stuff can actually do for you and what it can bring you. So that’s what I’m really excited about. I’ll be working on some videos very soon about a lot of this stuff.
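The routing Apple described can be summed up in a few lines. This is purely a conceptual sketch: none of these function or field names are real Apple APIs, and the exact decision logic is my reading of what they said on stage, not documented behavior.

```python
# Conceptual sketch of the Apple Intelligence request flow as described:
# on-device first, Private Cloud Compute for heavier in-domain requests,
# and ChatGPT only with explicit per-request consent. Names are illustrative.
def route_request(request, approve_chatgpt):
    if request.get("fits_on_device"):
        return "on-device"                 # Apple's local models, fastest path
    if request.get("wants_chatgpt"):
        # The consent dialog: the user has to say yes or no every single time.
        if approve_chatgpt(request):
            return "chatgpt"               # requests not stored, IP obscured
        return "declined"
    # Too complex for local models, but still within Apple's models' expertise.
    return "private-cloud-compute"         # Apple silicon servers
```

So in this framing, a simple rewrite request stays on the phone, while "ask ChatGPT about this photo" pops the consent box first, and everything else in between falls through to Private Cloud Compute.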
So definitely, like I said, get subscribed if you haven’t already, but this has been your overview of what you need to know. Thanks for watching. Catch you guys in the next one. Peace. (gentle music)