Mark Zuckerberg just described my AI nightmare, and I’m terrified

Meta CEO Mark Zuckerberg laid out a vision for the future of AI at Siggraph 2024, and I'm already terrified.

Nvidia and Meta CEOs talking at Siggraph 2024. Image: Nvidia

Mark Zuckerberg just freaked me out at Siggraph 2024, and it has nothing to do with Sweet Baby Ray's. No, Zuckerberg, sitting down with Nvidia CEO Jensen Huang to muse on the future of generative AI, described a future in which your endless doom scrolling will get even more endless with AI-generated content built specifically for you.

I wish I had better words to describe this, but I should really leave it to Zuckerberg to say himself:

“With generative AI, I think we’re going to quickly move into this zone where, not only is the majority of the content that you see today on Instagram just recommended to you from stuff that’s out there in the world that matches your interests… I think in the future a lot of this stuff is going to be created with [generative AI] tools, too.


“Some of that is going to be creators using the tools to create new content. Some of it, I think eventually, is going to be content that’s either created on the fly for you, or pulled together and synthesized through different things that are out there.”

I’m sure I’m not the only one who gets annoyed at the large AI popups slowly taking over Google and nearly every social media platform, but AI-generated content built for you on the fly is a big step beyond that. It’s not so much that these platforms want to keep you engaged — we know that’s the ultimate goal — but that it would be taking content you enjoy, ripping out the soul of the people who made that content, and coming up with something new. It’s an AI-generated blog post disguised as a social media post, and I don’t want to take part in that.

That’s not the future we’re in today, though. Zuckerberg was laying out a vision of the future, mainly on the back of Meta’s new AI Studio. The feature, which was unceremoniously announced with an offhand comment during the chat, is built on Meta’s new Llama 3.1 model, and it lets users discover different AI characters while allowing creators to build an AI of their own for fans to interact with.

Zuckerberg says that one of the largest uses Meta has seen for Meta AI so far is support. The executive says users are turning to the AI to role-play stressful situations in what he calls a “judgement-free zone.” There are plenty of problems with leaning on an AI to work through a stressful situation, and even more with users turning to creator-made AI bots given the prevalence of parasocial relationships online, but I guess this is the direction we’ve been heading.

Still, you can create your own character on the AI Studio website, or in the Instagram app. Or you can just stick with Character.ai, which has done the same thing for quite some time.

