CNET

Apple Offers Up to $1 Million to Anyone Who Can Hack Its AI Servers

K. Hernandez · 3 hr ago
Apple CEO Tim Cook waxed poetic about the arrival of some Apple Intelligence features for the iPhone 15 Pro, Pro Max and iPhone 16 last week, saying in a tweet that the intro of Writing Tools along with new cleanup features for photos and a more conversational version of its Siri voice assistant is "the beginning of an exciting new era."

But reviewers say Apple Intelligence isn't all that, at least not yet. CNET editor Bridget Carey reminds us the new generative AI features will be available in the US in only a limited way — you need to go to your iPhone settings to get on the wait list, and doing things like making your own emojis with AI (a feature Apple calls Genmoji) will come later.

Meanwhile, CNET mobile reviewer Lisa Eadicicco said you shouldn't "expect your iPhone to feel radically different" and called the new features "a first step in what could hint at larger changes" later on. So far, she finds the message and notification summaries the most useful.

"What I've come to appreciate most is that I can look down at my phone after getting a barrage of texts or Slack messages and know whether it's an emergency just from the lock screen," she says in her iOS 18.1 early review. "The summaries aren't perfect (AI, as it turns out, can't nail sarcasm and doesn't know the inside jokes I share with my friends)," she adds. "But this type of functionality is exactly the type of passive, practical intelligence I'm hoping to see more of on smartphones in the future."

Though Apple continues to roll out AI features slowly as part of what software chief Craig Federighi said last month is the company's strategy to "get each piece right and release it when it's ready," one thing Apple feels very confident about is how it's handling the privacy and security on the Private Cloud Compute, or PCC, servers that power some Apple Intelligence features.

That's why it's inviting hackers as well as privacy and security professionals and researchers to verify the security claims it's made about PCC and is offering bounties from $50,000 up to $1 million to anyone who finds a bug or major issue. PCC, Apple claims, is the "most advanced security architecture ever deployed for cloud AI compute at scale."

Here's a nontechnical explainer of PCC, and you can find a more technical one from Apple here. Bottom line: Apple guarantees it protects all the data on your iPhone by keeping it on the device (which is known as on-device or local processing). If a complex AI task needs to be handed off to more-powerful computers in the cloud — PCC servers running custom Apple chips — the company promises it'll use "your data only to fulfill your request, and never store it, making sure it's never accessible to anyone, including Apple."

"Because we care deeply about any compromise to user privacy or security, we will consider any security issue that has a significant impact to PCC," Apple says about the Apple Security Bounty reward program. "We'll evaluate every report according to the quality of what's presented, the proof of what can be exploited, and the impact to users."

Here are the other doings in AI worth your attention.

ChatGPT embraces search, in challenge to Google

Since it was released two years ago, OpenAI's groundbreaking ChatGPT AI chatbot — now used by 250 million people every week — has been used to write letters, emails, ads and wedding vows; provide summaries of long reports; and craft software code. But despite the chatbot's ability to respond to users' prompts in plain, everyday English, one thing it couldn't do well was provide answers to search queries with links to the web source — like Google does — since the training data in OpenAI's large language model wasn't being updated with the latest news.

That changed last week when the San Francisco startup delivered on its promise to evolve its popular gen AI chatbot into a gen AI search engine that could grab data for you from the web in real time. The new search functionality and up-to-date links are powered by OpenAI partner and investor Microsoft and its Bing search engine.

"You can get fast, timely answers with links to relevant web sources, which you would have previously needed to go to a search engine for," OpenAI wrote in an Oct. 31 blog post. "Getting useful answers on the web can take a lot of effort. It often requires multiple searches and digging through links to find quality sources and the right information for you. Now, chat can get you to a better answer: Ask a question in a more natural, conversational way, and ChatGPT can choose to respond with information from the web. Go deeper with follow-up questions, and ChatGPT will consider the full context of your chat to get a better answer for you."

What does it all mean? It's another step in the escalating battle between OpenAI and Google, which has been adding gen AI functionality on top of its search engine, with its Gemini chatbot. Meta is also reportedly working on an AI search engine.

They're all fighting for your attention — and a share of the money to be made in gen AI.

As for where OpenAI is getting those links to the latest news, it's signed licensing deals with publishers including the Associated Press, Axel Springer, Condé Nast, Dotdash Meredith, Financial Times, GEDI, Hearst, Le Monde, News Corp, Prisa (El País), Reuters, The Atlantic, Time and Vox Media, CNET's Imad Khan reported. Meanwhile, The New York Times is suing OpenAI and Microsoft for scraping its stories without permission.

As for other content providers concerned that the AI company is co-opting their work without permission or compensation, Khan noted that, "It's up to other news publishers if they want to have OpenAI's robots crawl their sites for information."

Meanwhile, OpenAI told Bloomberg News that three-quarters of its revenue comes from the 1 million paying subscribers (consumers and businesses) that it's signed up. But that isn't enough to fund its operations — the privately held company made history last month when it raised $6.6 billion, in one of the largest venture capital funding rounds in US history.

Google engineers are using AI to help write code

Software engineering as a profession isn't going anywhere anytime soon. But as I noted in a recent column about the future of jobs, it's a profession that will be changed by AI, as tools are used to write code more efficiently and effectively (and less expensively). I cited recent comments by executives at Amazon and Perplexity to make the point.

Now add Google to the list. During the company's earnings call last week, CEO Sundar Pichai said that more than 25% of all new code at Google is generated by AI, then reviewed and accepted by engineers.

"We're ... using AI internally to improve our coding processes, which is boosting productivity and efficiency," Pichai said. "This helps our engineers do more and move faster."

You can read all his remarks here. But I'll just repeat that if software engineers need to start rethinking what they do, then it's probably time we all reflect on how AI will change our jobs in the not-too-distant future.

Also worth knowing...

If you want to learn or hone your AI prompt-writing skills, Google has a new 10-hour, go-at-your-own-pace course that's offered through Coursera for $49. Called Prompting Essentials, it was developed by AI experts at Google and Google's DeepMind AI lab. The company says you don't need any prior experience with AI and prompting and that you'll be able to "build a library of reusable prompts." I thought this bit was interesting: Google experts say most prompts are too short. Successful prompts average 21 words, but "data shows user prompts are often shorter, containing nine words or less."

Oct. 30 marked the one-year anniversary of US President Joe Biden releasing his executive order on AI. The administration shared a list of the more than 100 actions completed by various federal agencies, but I'll note there's still no sweeping AI regulation in place in the US akin to what the European Union passed earlier this year as part of the EU AI Act.
