Staying Up Zine is Live!

The Mobile AR Lab has supported the creation of “Staying Up,” a Brooklyn-based graffiti zine. Staying Up opens with local NYC graffiti legend Jesus Saves and a list of NYC’s best writers who are currently up.

During COVID the lab lost access to our regular equipment and decided to help create a zine documenting the 2020–present graf movement. The lab had been doing AR graf research since 2018, and we decided to take it back and appreciate graf for graf, because a historic moment was taking place in NYC graf history and the lab was there to document it! The documentation took the form of a zine called Staying Up, which comes in two forms: free and paid. The first (paid) issue sold out in 18 hours! The zines can be found at graffiti and skate shops across Brooklyn! We have big plans for graffiti and AR and a number of collaborations in the works. More soon!

Want one?

Find a graf (or skate/BMX) shop in Brooklyn and if you are lucky there will be a free copy waiting for you ^^.

Lens Studio Segmentation Guide Tutorial 2021 — Mobile AR Lab @ NYU

Lens Studio Segmentation Guide

Lens Studio segmentation textures can isolate specific elements that your camera sees and render them out as a new texture to be used elsewhere in your Lens.

To isolate and reuse a specific piece of what the camera sees, follow these steps.

Step 1: Add Screen Image & Duplicate Camera

Open a New Project in Lens Studio.

In the Objects panel, click + to add a new Screen Image. In the Inspector, set the Image component’s Stretch Mode to Stretch.

Next, right-click the Orthographic Camera that came with the Screen Image and duplicate it. Disable the duplicate for now.

Make sure you switched the Stretch Mode on the Image component in the Inspector!

Step 2: Add Resources

Navigate to the Resources panel and add three new resources: right-click -> Add New, or click + and start typing to search.

1: Segmentation Texture (10 options)
2: An Unlit Material
3: A new Render Target

Step 3: Set Material Specifications

Select the Unlit material and view its components in the Inspector pane.

• In the Color Mask component, switch on the Alpha Channel.
• Switch the Blend Mode to Normal.
• Toggle the Opacity Texture on.
• Set the Opacity Texture by dragging in the Segmentation Texture from the Resources pane.
• Select the Screen Image in the Objects pane and assign the Unlit material to the Screen Image’s Material.

You will now see a white, alpha channel mask updating over the area defined by the segmentation texture.

Back in the Resources pane, drag the Device Camera Texture to the Screen Image component’s Texture, replacing the white texture with what the camera sees.

Step 4: Setting up the Render Target

Now it’s time to render that alpha information out to a new texture that we can manipulate however we please.

Select Render Target 2 in the Resources pane.

Navigate to the Inspector to modify Render Target 2’s properties.

• Set Clear Color to Color.
• Open the Color Box and drag Opacity to 0%.
• In the Objects pane, select the Orthographic Camera and navigate to the Inspector.
• Set the Render Target to Render Target 2 by dragging it in from the Resources pane.

Step 5: Setting up the Segmentation Layer

• Select the Orthographic Camera in the Objects Panel.
• In the Inspector’s Camera component, click the Layers box, then + Add New Layer.
• In the same window, click to activate the new, yellow Layer 1 and deactivate the green Orthographic Layer by clicking it off. Then click out to close the Layers window.
• Back in the Objects pane, select the Orthographic Camera’s Screen Image 0. In the upper right of the Inspector pane, you can see that the Screen Image’s layer is still set to Orthographic. Switch it to the newly created Layer 1.

Step 6: Reactivate the Duplicate Camera and Assign the Segmentation to the Screen Image Texture


• Reactivate the Orthographic Camera 2 that you created in Step 1.
• Click the disclosure box next to Orthographic Camera 2 to open the Full Frame Region 0 and locate the Screen Image 0.
• From the Resources pane, drag Render Target 2 to the Screen Image 0’s Texture.

Now you can use the Screen Image to display just the area of segmentation. Try using the Tween Manager and making lots of duplicates for neat effects. The Segmentation Texture can also be used as a texture in Particle systems.

taught by Christopher Strawley


Behavior Scripts Lens Studio Tutorial — Mobile AR Lab @ NYU

Calculating Distance with Behavior Scripts in Lens Studio

Open a New Project in Lens Studio

We will be using this Lens to calculate the distance between the user and a 3D object placed in the world.  That distance can then be used to change the output of objects in your scene.

Click the Project Info button at the top-left of the screen and Uncheck the Front Facing Camera option.  We will be using this Lens as a window into a 3D augmented world with our phones as the viewport.

In the Objects panel, click + to create a new Text3D component.

Click + in the Objects panel and search for World Object Controller under Helper Scripts.

Click the disclosure button to the left of the newly created World Object Controller and delete the example FBX Object.

Replace the deleted resource with the Text3D component and drag it to the top of the World Object Controller’s hierarchy.

We are going to add a component that will allow device tracking.  Select the Camera in the Objects panel.  Look for the Inspector and click the +Add Component button to search and add Device Tracking.  It will now be added as a component of the Camera.

Back in the Objects panel, click the + to search and add a Behavior Script.  It is located under Helper Scripts -> Behavior Script.  Drag the Behavior Script over the World Object Controller object to make it a child of the World Object Controller.

In the Scene panel, use “W”, “E”, and “R” to activate the transform gizmos.

“W” – Position

“E” – Rotation

“R” – Scale

Then move and scale the Text3D so that it sits on top of the green Ground Grid in the scene.

Be sure that your cursor is in the scene panel when you use these shortcuts to avoid renaming your Objects.  

Select the newly created Behavior script and in the Inspector, switch the trigger to Distance Check.

Underneath Trigger, set Object A to the World Object Controller. Next, set Object B to the main Camera object that has the Device Tracking component.

Set the Compare Type to Is Less Than and provide a unit of distance to compare.  This example uses 50.0.  You may have to play with these numbers depending on your space.

Under the Response Type drop down box, select “Send Custom Trigger” and set the Trigger Name to “close”.
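Under the hood, a Distance Check simply compares the world-space distance between the two objects against your threshold each frame, and sends the matching custom trigger. The logic can be sketched in plain JavaScript (outside Lens Studio; the position objects here are hypothetical stand-ins for what `getTransform().getWorldPosition()` would return):

```javascript
// Plain-JS sketch of the comparison a Distance Check trigger evaluates.
function distance(a, b) {
  const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Returns the custom trigger name a "close"/"far" pair of checks would send.
function checkDistance(objectA, objectB, threshold) {
  return distance(objectA, objectB) < threshold ? "close" : "far";
}

// Example: camera 30 units from the object, threshold 50 -> "close"
console.log(checkDistance({ x: 0, y: 0, z: 0 }, { x: 0, y: 0, z: 30 }, 50.0));
```

This is also why the threshold may need tuning per space: the units are world units, and how far the user physically walks per unit depends on the tracking setup.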

Then right click on the Behavior script component and duplicate it.

Paste the copied Behavior script below the first, set its Compare Type to Is Greater Than, and change its Trigger Name to “far”.

Now create a new Script in the Resources panel and attach it to the Behavior Script object, dragging it below the two Distance Check Behavior scripts.

Double click the newly created Script in the Resources window to open the Script Editor. Start by adding a new component input with the following code. This will allow you to drag in the 3D Text Scene Object to be changed within our script.

// -----JS CODE-----

//@input Component.Text3D words

The //@input directive exposes any type of Lens Studio component as an input. To access a Text3D component, declare the type (Component.Text3D) and then name the variable you will refer to in your script (words). Save the script and an input slot will appear on the Script component.
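The same pattern works for other input types. A few illustrative declarations (the variable names here are made up for the example):

```javascript
// Each line exposes one slot in the Inspector for drag-and-drop or typed values.
//@input Component.Text3D words
//@input SceneObject target
//@input float threshold = 50.0
```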

The Behavior scripts we are using to check distance provide a method for receiving the custom triggers they send when the Distance Check conditions are met. You can double-click the Behavior Script in the Resources panel and look through the various methods it includes. In our case, we are interested in the addCustomTriggerResponse() method.

Go to the Resources pane, open the Scripts folder under the World Object Controller, and double-click the Behavior Script to expose its code.

Let’s copy the addCustomTriggerResponse() line into the script with the Text3D input, so we can make changes when either distance check condition is triggered.

This method looks for the custom trigger with the same name as the triggerName argument and runs the callback function when that trigger fires. Make sure the triggerName matches the Trigger Name you set on the Behavior scripts; this value is passed in as a string.
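Conceptually, the behavior system keeps a map from trigger names to lists of callbacks, and sending a custom trigger runs everything registered under that name. A simplified, plain-JavaScript model of that idea (not the actual Lens Studio implementation):

```javascript
// Simplified model of a custom-trigger registry, for intuition only.
const responses = {};

// Register a callback under a trigger name (what addCustomTriggerResponse does).
function addCustomTriggerResponse(triggerName, callback) {
  (responses[triggerName] = responses[triggerName] || []).push(callback);
}

// Fire every callback registered under a trigger name.
function sendCustomTrigger(triggerName) {
  (responses[triggerName] || []).forEach(function (cb) { cb(); });
}

// Example: register and fire a trigger.
let label = "";
addCustomTriggerResponse("close", function () { label = "close"; });
sendCustomTrigger("close");
console.log(label); // "close"
```

In Lens Studio the registry lives on global.behaviorSystem, so any script in the scene can register for triggers any Behavior script sends.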

Then set up your callback functions in the same script. For simplicity, you can name your callback functions the same as your custom trigger names. Because we have the 3D Text component input, we can set the text for the 3D object with either of these functions that are called when the custom trigger is sent.

Example Code:

//@input Component.Text3D words

global.behaviorSystem.addCustomTriggerResponse("close", close);

global.behaviorSystem.addCustomTriggerResponse("far", far);

function close() {
    script.words.text = "close";
}

function far() {
    script.words.text = "far";
}
Finally, preview the effect in the Preview pane by selecting an environment in the upper part of the window.

You can also pair your Snapchat account and send the Lens to your device.

The Text3D Component is just one of many possibilities able to be controlled with Behavior Scripts.  Look through all of the Triggers on the Behavior script to see what else you can control!




by Christopher Strawley

MARS Mobile Augmented Reality Show AR 2021 is LIVE! 12/17/21 – till infinity

MARS Mobile Augmented Reality Show AR 2021 is LIVE @ the Damon BakAR Gallery! Show dates – 12/17/21 – till infinity.

How to see it-

Get Snapchat. Scan the code. See the 2021 AR show!! You need two phones to really experience it. See the documentation:


Arwa Alsaati / Katelynn Browne / Anastasia Green / Vesper Guo / Halley Yue Huang / Vasu Kuram / Rachael Lu / Marketa Mala / Shehara Ranasinghe / Tanvi Sharma / Christopher Strawley / Ni Ni Than / Kristian Zadlo

The DAMON (Loren) BakAR(er) Gallery Grand Opening Ribbon Cutting 2021!

The DAMON (Loren) BakAR(er) Gallery Grand Opening Ribbon Cutting 2021! It happened! We cut the ribbon and the gallery is finally open!!!!

This Gallery is dedicated to Damon Loren Baker, Pioneer of the field of Augmented Reality Art (among many other things).

Damon is responsible for some of the first works of augmented and virtual art, often made with sound. The Damon BakAR Gallery is an augmented reality gallery that can be seen by anyone with a smartphone and a proper net connection. Damon pioneered this very form of art. The gallery will only show augmented reality art that pushes what the medium can do. Damon’s pioneering work in AR will now live on through shows and live events (sound anyone???) by future generations of AR creators. We plan to have regular shows, so please check back regularly.

The DAMON (Loren) BakAR(er) Gallery is run by NYU’s Mobile AR Lab. Damon was instrumental in setting up The Mobile AR Lab and he was a Resident with the lab where we created the world’s first “Augmented Ensemble”.

This is the beta build of the gallery and we plan to build it out a bit, but it works and is live now!!! Our show is live even as all the other NYU shows were shut down due to COVID Omicron. The future is uncertain, and the timing of this gallery could not be better!

The DAMON (Loren) BakAR(er) Gallery actively accepts AR ART curatorial proposals. We love AR (sound) + all AR art so feel free to reach out.


Video Soundtrack: Performance by Artist: Damon Loren Baker

Song: Birth Pangs [excerpt]

Recorded at Gerrard Art Space (Eclec~Tic~Toc Fest, Vol. 1 – Night 2), July 15, 2017. http://mechanicalforestsound.blogspot…

Damon Loren Baker TWITTER…

The Mobile AR Lab @ NYU is run by Mark Skwarek. Apply for shows here.

Damon BakAR Gallery Opens w MARS Show! 12/7/21 -1/7/22 (and the shows up indefinitely!)

The Mobile AR Lab is back, and we are kicking it off with the launch of our very own AR gallery, “The Damon BakAR Gallery,” which, to be clear, is an AR gallery and can be seen around the world by smartphone users. We do have shows in IRL gallery spaces but… that’s not really the point. Our first show is MARS [Mobile Augmented Reality Show 2021] and showcases some of the very best AR at NYU! This group show includes the artists

Arwa Alsaati / Katelynn Browne / Anastasia Green / Vesper Guo / Halley Yue Huang / Vasu Kuram / Rachael Lu / Marketa Mala / Shehara Ranasinghe / Tanvi Sharma / Christopher Strawley / Ni Ni Than / Kristian Zadlo