Microsoft Open-Sources New Version of Phi-4: Inference Efficiency Up 10x, Runs on Laptops
Jin10 Data reported on July 10 that, earlier that morning, Microsoft open-sourced the latest member of the Phi-4 family, Phi-4-mini-flash-reasoning, on its official website. The mini-flash version continues the Phi-4 family's hallmark combination of small parameter counts and strong performance. It is designed specifically for scenarios constrained by compute, memory, and latency: it can run on a single GPU and is suited to edge devices such as laptops and tablets. Compared with the previous version, mini-flash adopts SambaY, an innovative architecture developed in-house at Microsoft, boosting inference efficiency by up to 10 times and cutting average latency by 2-3x, a significant improvement in overall inference performance.