Snapchat's new Lens Studio lets anyone build AR effects
Bonus: A complete 3-hour masterclass on how to build AI agents, for free (only the first 50 can join)
Welcome, Prohumans.
Here’s what you’re going to explore in this post:
Snap: New AR effects
Amazon's AI is quietly taking over
Just happened in AI
Bonus:
Build AI Agents and 10x your productivity | No coding 🔥
Join this power-packed 3-hour AI masterclass and learn how to automate workflows, supercharge productivity, and unlock new revenue streams with the cutting-edge AI tools and AI agents that top professionals use.
Zero Coding Required!
If you’re still figuring out how to use AI in your career, you’re already behind.
What You’ll Gain in Just 3 Hours:
Automate repetitive, boring tasks
Master 5+ AI tools and hacks
Learn agentic AI workflows and how to implement them
Design AI agent, sub-agent, and tool architectures
BONUS: Get exclusive Gen AI templates & workflows
Date: Saturday, June 7 | Time: 10 AM EST
FREE for the first 50 professionals (Worth $399)
Snap just made AR creation as easy as texting
Snap is putting AR tools into everyone’s pocket. With a new iOS app and web version of Lens Studio, the company is turning AI-powered lens creation into something anyone can try.
Snap launched a new Lens Studio iOS app and a web version that requires no download. Users can now build AR effects using text prompts, templates, and Bitmoji.
These tools are built for beginners, not just professional developers.
Snap still supports its full-featured desktop studio for advanced users. The company has 400,000+ AR creators but wants to open the door to millions more.
Meta’s shutdown of Spark AR left a gap; Snap is moving quickly to fill it.
New pro tools now support Bitmoji games, with turn-based systems and character controls.
Snap isn’t just keeping AR alive; it’s democratizing it. Most tech companies are pulling back from ambitious consumer tools. Snap is betting the opposite: that everyday people want to create, not just consume. If it’s right, Lens Studio could become the Canva of AR.
Amazon is turning warehouses into thinking machines

Amazon just pulled back the curtain on how it’s baking AI deep into its logistics and delivery network.
The company unveiled agentic AI-powered robots that can handle multiple warehouse tasks, not just one.
These new systems respond to natural language and adapt to high-pressure moments like holiday rushes.
AI-generated maps are helping delivery drivers navigate complex locations with better precision.
Amazon confirmed it’s exploring smart eyeglasses for drivers, offering hands-free, heads-up navigation.
Geospatial tech is already live in the U.S., especially helpful in large apartment complexes.
AI also helps Amazon stock products locally by forecasting demand based on region, price, and even weather.
The goal: faster delivery, less waste, and a leaner carbon footprint.
Amazon isn’t playing catch-up in the AI race; it’s quietly leading where it matters most: execution. While others talk AGI, Amazon is turning warehouses into intelligent ecosystems. If this scales, the future of logistics might look less like a fleet of trucks and more like a coordinated system of smart agents, each doing just enough, just in time.

Thanks for reading…
That’s a wrap.
What's on your mind?
Share your best ideas with us at theprohumanai@gmail.com
Send them our way, and we'll get to work on bringing them to life.
Did you find value in our newsletter today? Your feedback helps us create better content for you!
I hope this was useful. If you want to learn more, visit this website. Get your brand, product, or service in front of 700,000+ professionals here. Follow us on 𝕏/Twitter to learn more about AI.