[Crypto World] NEAR Protocol has recently launched two pretty interesting new features—one called NEAR AI Cloud, a cloud service, and the other is Private Chat, a privacy-focused chat tool.
Let’s start with the AI Cloud service. Its special feature is that every request you send runs inside Intel TDX and NVIDIA Confidential Computing hardware. In other words, your data is processed in a sealed, isolated “black box” that no outsider can access. What’s even more impressive is that after each AI inference, the system generates a cryptographic proof: a kind of digital receipt that confirms the model really ran as expected, without any tampering.
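To make that “digital receipt” idea concrete, here is a minimal Python sketch of what client-side verification could look like. It is purely illustrative: NEAR has not published this API, so every name in it (AttestationReport, verify_inference, the key and hashes) is made up, and a real TEE flow would validate a hardware-signed attestation through Intel’s and NVIDIA’s attestation services rather than a shared HMAC key. The point is only the shape of the check: the proof has to bind your exact prompt, the exact response, and the expected model to a signature that only the sealed hardware could have produced.

```python
# Conceptual sketch only. NEAR has not published this client API; all names
# here (AttestationReport, verify_inference, the key and hashes) are hypothetical.
import hashlib
import hmac
import json
from dataclasses import dataclass

@dataclass
class AttestationReport:
    """Hypothetical 'digital receipt' returned alongside an AI inference."""
    model_hash: str         # hash of the model loaded inside the enclave
    request_hash: str       # hash of the prompt that was processed
    response_hash: str      # hash of the generated output
    enclave_signature: str  # signature produced inside the sealed hardware

# Stand-in for a hardware-rooted attestation key. In a real TEE flow this is
# checked against Intel's / NVIDIA's attestation services, not shared like this.
TRUSTED_ENCLAVE_KEY = b"stand-in-for-a-hardware-rooted-key"
EXPECTED_MODEL_HASH = hashlib.sha256(b"advertised-model-weights").hexdigest()

def verify_inference(prompt: str, response: str, report: AttestationReport) -> bool:
    """Check that the receipt matches what we sent/received and was signed in the enclave."""
    # 1. The receipt must describe *our* request and *this* response.
    if report.request_hash != hashlib.sha256(prompt.encode()).hexdigest():
        return False
    if report.response_hash != hashlib.sha256(response.encode()).hexdigest():
        return False
    # 2. The enclave must have been running the model we expected.
    if report.model_hash != EXPECTED_MODEL_HASH:
        return False
    # 3. The whole receipt must carry a signature only the sealed hardware can produce.
    payload = json.dumps(
        [report.model_hash, report.request_hash, report.response_hash]
    ).encode()
    expected_sig = hmac.new(TRUSTED_ENCLAVE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected_sig, report.enclave_signature)

# Demo: pretend to be the enclave, produce a receipt, then verify it client-side.
prompt, response = "What is NEAR AI Cloud?", "A confidential inference service."
request_hash = hashlib.sha256(prompt.encode()).hexdigest()
response_hash = hashlib.sha256(response.encode()).hexdigest()
signature = hmac.new(
    TRUSTED_ENCLAVE_KEY,
    json.dumps([EXPECTED_MODEL_HASH, request_hash, response_hash]).encode(),
    hashlib.sha256,
).hexdigest()
report = AttestationReport(EXPECTED_MODEL_HASH, request_hash, response_hash, signature)
print(verify_inference(prompt, response, report))  # True
```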
As for Private Chat, it basically wraps that cloud service into an everyday chat tool. Whether you’re asking casual questions or conducting in-depth research, it provides verifiable privacy protection. Put simply, it lets you use AI without worrying that your conversations will be snooped on or misused.
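“Verifiable privacy” in a chat client can be read as: don’t just trust the reply, check its receipt before showing it. Below is a tiny hypothetical sketch of that idea; the ChatReply shape and the render function are assumptions for illustration, not part of any published Private Chat interface.

```python
# Hypothetical sketch of a chat front end using such a receipt; the field names
# and render() helper are assumptions, not a published NEAR Private Chat API.
from dataclasses import dataclass

@dataclass
class ChatReply:
    text: str
    receipt_valid: bool  # result of a check like verify_inference() above

def render(reply: ChatReply) -> str:
    # Only show answers whose confidential-computing receipt verified, so
    # "verifiable privacy" is enforced by the client, not just promised.
    if not reply.receipt_valid:
        return "[warning] proof check failed; response hidden"
    return reply.text

print(render(ChatReply(text="Hello from inside the enclave", receipt_valid=True)))
print(render(ChatReply(text="tampered?", receipt_valid=False)))
```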
Looking at these two products together, NEAR is actually exploring a path of “AI + privacy computing”—making AI useful while ensuring user data security. In today’s world, where AI applications are everywhere but privacy issues are frequent, this direction is definitely worth paying attention to.
LightningClicker
· 4h ago
The logic of running AI in this sealed environment is truly impressive. With hardware support from Intel and NVIDIA, plus the ability to generate cryptographic proofs... it finally feels like there's something substantial here.
StakeWhisperer
· 7h ago
Sealed hardware running AI? Sounds pretty impressive, and this privacy solution is definitely a big step up from what the big companies offer.
AlwaysQuestioning
· 10h ago
Running AI in a sealed black box? Sounds pretty intimidating, but can this thing really be trusted?
NEAR’s move this time is pretty impressive—finally, someone is taking data privacy seriously.
As for those cryptographic proofs, I need to look at the source code to feel assured. Otherwise, at the end of the day, it’s still a matter of trust.
AlwaysMissingTops
· 10h ago
Sealing data in a black box to run AI? This sounds like putting a security door on your data—does it really work?
The cryptographic proof part is pretty impressive; finally, there's a project brave enough to "insure" data.
Private chat is basically the same old thing in a new shell, but as long as it's convenient to use, it's fine.
With Intel and Nvidia hardware backing it up, this setup is definitely robust.
With both cloud compute and cryptographic proofs in the mix, I wonder how the pricing works out...?
This is exactly what Web3 should be doing—finally, someone is taking data privacy seriously.
It sounds great, but you have to try it to know how deep the waters really are.
DeepRabbitHole
· 10h ago
Running AI in a sealed black box sounds intriguing, but can it really be trusted?
DefiVeteran
· 11h ago
Sealed hardware running AI? Sounds plausible, but I'm just worried it's another marketing gimmick.
Anyway, I'll give it a try—private chat is still better than being eavesdropped on.
NEAR is really making privacy its selling point this time, that's something.
If the data runs in a "black box," it has to be truly sealed... otherwise it's pointless.
Cryptographic proofs sound good, but let's see if they can actually deliver.