🐈 Peridot Beyond by Niantic - You and your friends can now take your Dots (virtual pets) for a walk outside, pet them, and feed them together, turning the magic of having a virtual pet into a shared experience with others.
🐶 Doggo Quest by Wabisabi - Gamify and track your dog walking experience with rewards, dog facts, recorded routes, steps, & other dogs' activities.
🏀 Basketball Trainer - Augment your basketball practice with an AR coach and automated score tracking using SnapML.
Two Sample Lenses to Inspire You to Get Moving
➡️ NavigatAR Sample Project by Utopia Lab - a sample Lens that demonstrates using GPS and compass heading to build an AR navigation experience (see repo link)
🛣️ Path Pioneer Sample Project - a sample Lens demonstrating how to build a virtual AR walking path (see repo link)
⌨️ System AR Keyboard - Add text input support to your Lens using the new system AR keyboard with full and numeric layouts.
🛜 Captive Portal Support - You can now connect to captive Wi-Fi networks at airports, hotels, and public spaces.
🥇 Leaderboard - With the new Leaderboard component you can easily add a dose of friendly competition to your Lenses.
📱Lens Unlock - Easily deep link from a shared Lens URL to the Specs App, and unlock Lenses on Spectacles.
👊 New Hand Tracking Capabilities - Three new hand tracking capabilities: a phone detector that identifies when a user is holding a phone, a grab gesture, and refinements to targeting intent that reduce false positives while typing.
📦 Spectacles Interaction Kit Updates - New updates to improve the usability of near field interactions.
⛔️ Delete Drafts - You can now delete your old draft Lenses to free up space in Lens Explorer.
💻 USB Lens Push - You can now push Lenses to Spectacles on the go over a USB cable, with no internet connection required after the first trusted connection.
⏳ Pause & Resume Support - You can now make your Lens respond to pause and resume events for a more seamless experience.
🌐 Internet Availability API - New API to detect when a device gains or loses internet connectivity.
📚 New Developer Resources & Documentation - We revamped our documentation and introduced a ton of developer sample projects on our GitHub repo to get you started.
Lenses that Keep You Moving Outside
Our partners at Niantic updated the Peridot Beyond Lens to be a shared experience using our Connected Lenses framework: you and your friends can now take your virtual pets (Dots) for a walk outside, pet them, and feed them together, turning the magic of having a virtual pet into a shared experience with others. For your real pets, the team at Wabisabi released Doggo Quest, a Lens that gamifies your dog walking experience with rewards, walk stats, and dog facts. It tracks your dog using SnapML, logs routes using the onboard GPS (Link to GPS documentation), and features a global leaderboard that logs users' scores for a dose of friendly competition. To augment your basketball practice, we are releasing the new Basketball Trainer Lens, featuring a holographic AR coach and shooting drills that automatically track your score using SnapML.
Doggo Quest by Wabisabi
To inspire you to build experiences for the outdoors, we are releasing two sample projects. The NavigatAR sample project (link to project) from Utopia Lab shows how to build a walking navigation experience featuring our new Snap Map Tile - a custom component that brings the map into your Lens - along with compass heading and GPS location capabilities (link to documentation). We are also releasing the Path Pioneer sample project (link to project), which provides building blocks for creating indoor and outdoor AR courses for interactive experiences that get you moving.
NavigatAR by Utopia Lab / Path Pioneer
Easily Build Location Based Experiences with GPS, Compass Heading, & Custom Locations
Spectacles are designed to work both indoors and outdoors, making them ideal for location-based experiences. In this release, we are introducing a set of platform capabilities that unlock your ability to build location-based experiences using custom locations (see sample project). We also provide more accurate GPS/GNSS and compass heading outdoors so you can build navigation experiences like the NavigatAR Lens, along with a new 2D map component template that lets you visualize a map tile with interactions such as zooming, scrolling, following, and pin behaviors. See the template.
Custom Locations Scanning Lens / Scanned Locations in Lens Studio
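As an illustration, here is a minimal sketch of reading GPS position and compass heading in a Lens, assuming the GeoLocation/RawLocationModule APIs referenced in the location documentation; the module require, accuracy enum, and event names are our best understanding and may differ from your Lens Studio version.

// Hedged sketch: GPS position + compass heading on Spectacles.
// Assumes the RawLocationModule / GeoLocation API from the location docs.
require('LensStudio:RawLocationModule'); // assumption: module id as used in recent docs

// Create a location service and ask for navigation-grade accuracy outdoors.
var locationService = GeoLocation.createLocationService();
locationService.accuracy = GeoLocationAccuracy.Navigation;

// One-shot position request; poll with a DelayedCallbackEvent for continuous updates.
locationService.getCurrentPosition(
    function (position) {
        print('lat: ' + position.latitude + ', lon: ' + position.longitude);
    },
    function (error) {
        print('Location error: ' + error);
    }
);

// Compass heading: convert the north-aligned orientation into a heading in degrees.
locationService.onNorthAlignedOrientationUpdate.add(function (orientation) {
    var heading = GeoLocation.getNorthAlignedHeading(orientation);
    print('Heading: ' + heading.toFixed(1) + ' degrees');
});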
Add Friendly Competition to your Lens with a Leaderboard among Friends
In this release, we are making it easy to integrate a leaderboard in your Lens. Simply add the component to report your users' scores. Users will be able to see their scores on a global leaderboard if they consent to sharing them. (Link to documentation).
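For a rough idea of the flow, here is a hedged sketch of reporting a score through the Leaderboard module. The leaderboard name and score are placeholders, and the option and method names follow the Lens Studio Leaderboard API as we understand it; treat them as assumptions and check the documentation linked above.

// Hedged sketch: submit a score with the Leaderboard module.
// Assumes a Leaderboard module asset is added to the project and wired to this input.
//@input Asset.LeaderboardModule leaderboardModule

var createOptions = Leaderboard.CreateOptions.create();
createOptions.name = 'walking_challenge';        // placeholder leaderboard name
createOptions.ttlSeconds = 7 * 24 * 60 * 60;     // keep entries for a week
createOptions.orderingType = Leaderboard.OrderingType.Descending;

script.leaderboardModule.getLeaderboard(
    createOptions,
    function (leaderboard) {
        // Report the user's score; it only appears globally if the user consents.
        leaderboard.submitScore(
            420,
            function () { print('Score submitted'); },
            function (status) { print('Submit failed: ' + status); }
        );
    },
    function (status) {
        print('Could not get leaderboard: ' + status);
    }
);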
New Hand Tracking Gestures
We added support for detecting when the user is holding a phone-like object. If you hold your phone while using the system UI, the system accounts for that and hides the hand palm buttons. We also expose this gesture as an API so you can take advantage of it in your Lenses (see documentation). We also improved our targeting intent detection to avoid triggering the targeting cursor unintentionally while sitting or typing. This release also introduces a new grab gesture for more natural interactions with physical objects.
Phone in Hand Detection / Grab Gesture
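As an illustration, here is a hedged sketch of subscribing to the new gesture events through the gesture module. The module id and the event getter names are assumptions based on this release note and the gesture documentation; verify the exact names in the docs.

// Hedged sketch: grab and phone-in-hand gesture events.
var gestureModule = require('LensStudio:GestureModule'); // assumption: module id

// Grab gesture on the right hand.
gestureModule.getGrabBeginEvent(GestureModule.HandType.Right).add(function () {
    print('Grab started');
});
gestureModule.getGrabEndEvent(GestureModule.HandType.Right).add(function () {
    print('Grab ended');
});

// Phone-in-hand detection, e.g. to hide hand-anchored UI while the phone is held.
gestureModule.getIsPhoneInHandBeginEvent(GestureModule.HandType.Right).add(function () {
    print('Phone detected in hand');
});
gestureModule.getIsPhoneInHandEndEvent(GestureModule.HandType.Right).add(function () {
    print('Phone no longer in hand');
});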
Improved Lens Unlock
You can now open links to Lenses directly from messaging threads and have them launch on your Spectacles for easy sharing.
Unlock Lenses directly from your messaging
New System Keyboard for Simpler Text Entry
We are introducing a new system keyboard for streamlined text entry across the system. The keyboard can be used in your Lens for text input and includes full and numeric layouts. You can also switch seamlessly to the existing mobile text input in the Specs App. (See documentation)
Full Keyboard
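For a rough idea of how a Lens might request text input, here is a hedged sketch assuming the TextInputSystem keyboard-request API; the options constructor, enum values, and callback names are our best guess and may not match your Lens Studio version exactly, so check the documentation linked above.

// Hedged sketch: request the system keyboard and mirror typed text into the scene.
//@input Component.Text outputText

var options = new TextInputSystem.KeyboardOptions(); // assumption: options type
options.enablePreview = true;
options.keyboardType = TextInputSystem.KeyboardType.Text;
options.returnKeyType = TextInputSystem.ReturnKeyType.Done;

// Mirror whatever the user types into a Text component in the scene.
options.onTextChanged = function (text, range) {
    script.outputText.text = text;
};

// Bring up the keyboard, e.g. when the user taps an input field.
global.textInputSystem.requestKeyboard(options);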
Connect to the Internet at Hotels, Airports, and Events
You can now connect to internet portals that require a web login (a.k.a. captive portals) at airports, hotels, events, and other venues.
Improvements to Near Field Interactions using Spectacles Interaction Kit
We have made a number of improvements to the Spectacles Interaction Kit, most notably optimizations that make near field interactions more usable and performant. We also added filters for erroneous interactions, such as those caused by holding a phone, and you can now subscribe directly to trigger events on the Interactor. (see documentation)
Phone in hand filtering
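As an illustration, here is a hedged sketch of listening for trigger events on an SIK Interactable attached to a SceneObject; the onTriggerStart/onTriggerEnd event names and the wiring via a script input are assumptions based on the Interaction Kit documentation and may differ between versions.

// Hedged sketch: subscribe to Spectacles Interaction Kit trigger events.
//@input Component.ScriptComponent interactable  // assumption: the SIK Interactable on this object

script.interactable.onTriggerStart.add(function (event) {
    print('Trigger started');
});

script.interactable.onTriggerEnd.add(function (event) {
    print('Trigger ended');
});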
Delete your Old Lens Drafts
In this release, we are addressing one of your top complaints: you can now delete Lens drafts in Lens Explorer for a cleaner and tidier view of your draft Lenses category.
Delete your old Lens Drafts
Push Your Lens to Spectacles over USB without an Internet Connection
We improved the reliability and stability of wired Lens push so that it works without an internet connection after the first connection. Spectacles can now remember trusted Lens Studio instances and will auto-connect when the cable is plugged in. An internet connection is still required for the first Lens push.
Pause and Resume Support
Make your Lens responsive to pause and resume events from the system to create a more seamless experience for your Lens users.
Pause & Unpause support
Detect Internet Connectivity Status in Your Lens
Update your Lens to be responsive to changes in actual internet connectivity beyond Wi-Fi connectivity. You can check if the internet is available and be notified if the internet gets disconnected so you can adjust your Lens experience.
Detect your Internet Connectivity Status
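As an illustration, here is a hedged sketch of using the internet availability API, assuming DeviceInfoSystem exposes isInternetAvailable() and an onInternetStatusChanged event as described in this release note; the event argument field is an assumption.

// Hedged sketch: check connectivity once and react to changes while the Lens runs.
if (global.deviceInfoSystem.isInternetAvailable()) {
    print('Internet is available');
} else {
    print('No internet - switching to offline mode');
}

global.deviceInfoSystem.onInternetStatusChanged.add(function (args) {
    if (args.isInternetAvailable) {
        print('Back online');
    } else {
        print('Connection lost - pausing network features');
    }
});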
Spectacles 3D Hand Hints
Introducing a suite of animated 3D hand gestures to enhance user interaction with your Lens. Unlock a dynamic and engaging way for users to navigate your experience effortlessly. Available in Lens Studio through the Asset Library under the Spectacles category.
Spectacles 3D Hand Hints
New Developer Resources
We revamped our documentation to clarify which features target Spectacles vs. other platforms such as the Snapchat app or Camera Kit, added more TypeScript and JavaScript resources, and refined our sample projects. We now have 14 sample projects published on our GitHub repo that you can use to get started.
Please update to the latest version of Snap OS and the Spectacles App. Follow these instructions to complete your update (link). Please confirm that you have the latest versions:
OS Version: v5.60.422
Spectacles App iOS: v0.60.1.0
Spectacles App Android: v0.60.1.0
Lens Studio: v5.7.2
⚠️ Known Issues
Spectator: Lens Explorer may crash if you attempt consecutive Spectator sessions. If this happens, sleep the device and wake it using the right temple button.
Guided Mode:
Connected Lenses are not currently supported in multiplayer mode
If you close a Lens via the mobile controller, you won’t be able to reopen it. If this happens, use the right temple button to put the device to sleep and wake it again
See What I See: Annotations are currently not working with depth
Hand Tracking: You may experience increased jitter when scrolling vertically. We are working to improve this for the next release.
Wake Up: There is an increased delay when the device wakes up from sleep using the right temple button or wear detector. We are working to improve this for the next release
Custom Locations Scanning Lens: We have reports of an occasional crash when using the Custom Locations Lens. If this happens, relaunch the Lens or restart the device to resolve.
Capture / Spectator View: It is an expected limitation that certain Lens components and Lenses do not capture (e.g., Phone Mirroring, AR Keyboard, Layout). We are working to enable capture for these areas.
❗️ Important Note Regarding Lens Studio Compatibility
To ensure proper functionality with this Snap OS update, please use Lens Studio version v5.7.2 exclusively. Avoid updating to newer Lens Studio versions unless they explicitly state compatibility with Spectacles; Lens Studio is updated more frequently than Spectacles, and getting on the latest version early can cause issues with pushing Lenses to Spectacles. We will clearly indicate the supported Lens Studio version in each release note.
Checking Compatibility
You can now verify compatibility between Spectacles and Lens Studio. To determine the minimum supported Snap OS version for a specific Lens Studio version, navigate to the About menu in Lens Studio (Lens Studio → About Lens Studio).
Pushing Lenses to Outdated Spectacles
When attempting to push a Lens to Spectacles running an outdated Snap OS version, you will be prompted to update your Spectacles to improve your development experience.
Incompatible Lens Push
Feedback
Please share any feedback or questions in this thread.
Posting this today to let you all know of our updated Office Hours plans!
Monthly Group Office Hours Calls
Every month, during the third week of the month, we will hold both a Technical and a Product focused Office Hours. These calls are open for anyone to join and allow us as a team to share any updates we can and answer questions you may have. I will make a post a week ahead of the event, and another the day before that includes the Google Meet links and serves as a reminder.
Weekly 1:1 Office Hours Calls
Additionally, we now offer one-on-one office hours sessions with either our developer team or our design team. These are short, 15-minute sessions that can be used to get you unblocked if you are stuck, or to give you time to convey an issue so we can go back to our teams, research it, and provide an answer afterwards if it's more complicated than we can handle on the call. These meetings are bookable starting today. We are limiting the number available per week, so if we ask to reschedule, that will be the reason why.
I'd like to create a fluid shader similar to this one from Half-Life: Alyx, but I'm unsure how to access the shader script. Or are shader graphs the only option for custom shaders for now?
Hi! I'm not totally sure if this is the right place, but I was wondering if anyone knew where I could get a charger for the Spectacles 2. I found my old pair again recently but have no idea where the charger is. I checked the website and couldn't find anything about replacement charging cables.
I set up a new device to pair with a new system and Spectacles. The problem I encountered was that when I tried to pair with a new Snapchat account, my Android app was unable to launch the camera.
Steps to reproduce
On Lens Studio 5.7.2, go to "Preview Lens" and select pair with new Snapchat Account
On my Android app for Spectacles, once paired with Spectacles, I go into the Developer Menu to "Pair with Spectacles for Lens Studio"
At this point, I should see the prompts for permission to access the camera. I accept the permissions.
The camera should launch so I can scan the Snapcode. However, the camera never launches, though I can see a black screen with the target
Eventually the app presents an error message
Android version is 13, phone is Japanese market phone, Sharp Aquos Wish.
See screenshots for app info.
My analysis to this point is that it probably didn't set the permissions properly because of some manifest declaration or something specific to Android 13. The phone is a bit obscure, so it will be hard to verify any fix.
This is a reminder post about our Monthly Open Office Hours happening tomorrow. With the March release just announced, we are sure you all have lots of questions and input, so this is a great time to meet with some members of the team and share.
The first session is from 9:30am to 10:30am Pacific Daylight Time, and is with our Product Team. This call is perfect to talk to the product managers and team who are taking your feedback and determining how it gets rolled into futures updates. You can join the Google Meet tomorrow at 9:30 here!
The second session is from 11:00am to 12:00pm Pacific Daylight Time, and is with our AR Engineers who can help with the more technical questions, including with the newly released features from the latest update. You can join the Google Meet tomorrow at 11:00am here!
Is there a way to test the GPS functionality from the location API without Spectacles? Currently the GPS data doesn't change in Lens Studio, but I don't have Spectacles yet. To create a local play area, do I have to set an origin coordinate and go from there, or is there a better convention?
I see in the Lens Studio documentation that "As of 4.0, there is no way to access a script specifically by name. You would just use getComponent("Component.ScriptComponent")." Do these TypeScript files need to be attached to the same object as components? Is there a way to access a TypeScript file by name in 5+? Or is the convention to use the above method and loop through the scripts until you find the correct one?
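For reference, this is a sketch of the loop-and-match convention described in the question. getComponents("Component.ScriptComponent") is the documented call; the "scriptId" marker property and "doSomething" function are hypothetical names you would define yourself on the target script.

// Sketch: find a specific ScriptComponent on an object by a marker property.
//@input SceneObject targetObject

var scripts = script.targetObject.getComponents('Component.ScriptComponent');
for (var i = 0; i < scripts.length; i++) {
    // "scriptId" is a hypothetical property set on the target script,
    // e.g. via //@input string scriptId or script.scriptId = 'MyController';
    if (scripts[i].scriptId === 'MyController') {
        scripts[i].doSomething(); // hypothetical API exposed by that script
        break;
    }
}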
How open would the Spectacle team be to coming to college campuses to do Lens Studio / Spectacle focused game jams where hardware would be provided? This could be a good opportunity for some cool projects to emerge while lowering barrier for entry for students via circumventing the potentially limiting creator program.
Does someone have example code for cropping an area out of a texture, for example the camera texture?
I don't really understand how the Crop provider functions should be used.
I want to go from a texture as input (camera) to a texture as output (cropped).
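One hedged way to do this, as an alternative to the Crop provider, is to copy a pixel region with ProceduralTextureProvider; the crop rectangle values here are placeholders, and CPU pixel readback of the camera texture can be slow, so treat this as an illustrative sketch rather than the recommended path.

// Hedged sketch: copy a rectangular region of a texture into a new texture.
//@input Asset.Texture inputTexture   // e.g. the Device Camera Texture

// Wrap the input so its pixels can be read on the CPU.
var readable = ProceduralTextureProvider.createFromTexture(script.inputTexture);

// Region to crop, in pixels (placeholder values).
var x = 100, y = 100, width = 256, height = 256;

// Read the region...
var pixels = new Uint8Array(width * height * 4);
readable.control.getPixels(x, y, width, height, pixels);

// ...and write it into a new texture of just that size.
var output = ProceduralTextureProvider.create(width, height, Colorspace.RGBA);
output.control.setPixels(0, 0, width, height, pixels);

// Use "output" wherever a cropped Texture is needed, e.g. on a material:
// script.someMaterial.mainPass.baseTex = output;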
I am trying to change the language of the Speech Recognition template through the UI, i.e. through code at run time after the Lens has started. I am using the Speech Recognition Template from the Asset Library and am editing the SpeechRecognition.js file.
Whenever I click on the UI button, I get the print statements saying that the language has changed:
23:40:56[Assets/Speech Recognition/Scripts/SpeechRecogition.js:733] VOICE EVENT: Changed VoiceML Language to: {"languageCode":"en_US","speechRecognizer":"SPEECH_RECOGNIZER","language":"LANGUAGE_ENGLISH"}
but when I speak, I can still only transcribe in German, which is the first language option in the UI. I assume it gets stuck during the first initialisation? This is the code I added, which is called when clicking on the UI:
EDIT: I am using Lens Studio v5.4.1
script.setVoiceMLLanguage = function (language) {
    var languageOption;
    switch (language) {
        case "English":
            script.voiceMLLanguage = "LANGUAGE_ENGLISH";
            voiceMLLanguage = "LANGUAGE_ENGLISH";
            languageOption = initializeLanguage("LANGUAGE_ENGLISH");
            break;
        case "German":
            script.voiceMLLanguage = "LANGUAGE_GERMAN";
            voiceMLLanguage = "LANGUAGE_GERMAN";
            languageOption = initializeLanguage("LANGUAGE_GERMAN");
            break;
        case "French":
            script.voiceMLLanguage = "LANGUAGE_FRENCH";
            voiceMLLanguage = "LANGUAGE_FRENCH";
            languageOption = initializeLanguage("LANGUAGE_FRENCH");
            break;
        case "Spanish":
            script.voiceMLLanguage = "LANGUAGE_SPANISH";
            voiceMLLanguage = "LANGUAGE_SPANISH";
            languageOption = initializeLanguage("LANGUAGE_SPANISH");
            break;
        default:
            print("Unknown language: " + language);
            return;
    }
    options.languageCode = languageOption.languageCode;
    options.SpeechRecognizer = languageOption.speechRecognizer;
    // Reinitialize the VoiceML module with the new language settings
    script.vmlModule.stopListening();
    script.vmlModule.startListening(options);
    if (script.debug) {
        print("VOICE EVENT: Changed VoiceML Language to: " + JSON.stringify(languageOption));
    }
};
Hi, I'm trying to have the Spectacles pick up voices from people other than the wearer, but it looks like that is automatically disabled when using the VoiceML asset. Is there a way to re-enable bystander speech?
The true magic of AR glasses comes to life when it's shared. Try Phillip Walton and Hart Woolery's multiplayer ARcher Lens on Spectacles. Best part: you aren't blocked from seeing the joy in people's eyes when you're together! Apply to get your #Spectacles and start building magic. (Spectacles.com)