Quick start with a webcam


#1

If you want to try PapARt for a first AR application, this is the right place!

In this tutorial we install Processing and PapARt, then try the first example application.

Requirements

  • Hardware: Webcam, printer.
  • Operating system: Win/OSX/Linux.
  • Language: Processing
  • Time required: 30 minutes to 2 hours.
  • Difficulty: easy.

Guide tested on: Windows 10 64-bit, Linux x64 (Manjaro).

Step 1: Processing and libraries.

  1. First, you need to install Processing for your system. Once it is installed, run it once so that it creates all the folders.

  2. Then download and install the PapARt library.

  3. PapARt requires additional libraries; here is the collection: link. They are used for math, networking, 3D, and GUI. The collection includes well-known Processing libraries (PeasyCam, OSCP5, Video, Toxiclibs) and custom ones (SVGExtended, ProcessingTUIO, Skatolo, GuiModes, and Reflections).

  4. You will also need the Video library, and for some examples the Sound library; they are available through Processing (Sketch -> Import Library -> Add a Library) or directly here.

  5. The last library is JavaCV; it embeds pre-compiled OpenCV (C++) binaries. Consequently, you need to download the right package for your platform:

All these libraries must be installed in your $sketchbook/libraries/ folder:

  • Linux: /home/toto/sketchbook/libraries.
  • OSX: /Documents/Processing/libraries.
  • Windows: /My Documents/Processing/libraries.
  6. Download the PapARt examples, and extract them in your $sketchbook folder.

You need to restart Processing after installing a new library or adding sketches.
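As a side note, the default sketchbook locations listed above can be resolved programmatically. Here is a minimal Java sketch of that logic; it only mirrors the defaults listed above, so adjust it if you moved your sketchbook (newer Windows versions also use "Documents" rather than "My Documents"):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class SketchbookPath {
    // Default Processing libraries folder per OS, mirroring the list above.
    static Path librariesFolder(String osName, String home) {
        String os = osName.toLowerCase();
        if (os.contains("mac") || os.contains("win")) {
            return Paths.get(home, "Documents", "Processing", "libraries");
        }
        // Linux and others: the classic sketchbook folder.
        return Paths.get(home, "sketchbook", "libraries");
    }

    public static void main(String[] args) {
        System.out.println(librariesFolder(System.getProperty("os.name"),
                                           System.getProperty("user.home")));
    }
}
```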

Before going to the next step:

Start Processing, then go to the menu File -> Examples. In the pop-up, expand the folder Contributed Libraries. You should see most of the installed libraries: PeasyCam, PapARt, Toxiclibs, SVGExtended, etc… If not, check that you installed the libraries correctly. You can follow the official guide to install libraries.

Step 2: Libraries test.

Launch the example Sketchbook -> PapARt-examples -> first-examples -> Debug -> simple. Run the example. You might get warnings:

No library found for org.bytedeco.javacpp.opencv_core

Please ignore them; there is an open issue to fix this. It is just debug text.

Step 3: Camera test.

  1. Print the markerboard: You can find it in libraries/PapARt/data/markers/A4-default.pdf or online on GitHub: download link.

  2. Launch the example Sketchbook -> PapARt-examples -> first-examples -> Camera -> SeeThrough. Run the example, and show the printed sheet in front of the camera.

You can download presets for many cameras and copy one to sketchbook/libraries/PapARt/data/calibration/cameraConfiguration.xml (right click, save as):

The configurations are discussed on the camera library usage thread.


Possible issues:

Q: The app crashed, it could not start the camera.
A: You need to check the cable, and choose the right driver, as explained on this thread.

Q: I have two cameras, and I want to use the other one.
A: You can specify the camera name, or number depending on the driver. It is explained on this thread.

Q: I have not tried yet; is my camera supported?
A: Yes, most likely.

Conclusion

Now you have a fully working PapARt system for “see through” applications. You can try the
other “Camera” examples in PapARt-examples -> first-examples -> Camera. If you want a deeper view of PapARt's possibilities, you can look at PapARt -> features -> advancedUses. Most of the advanced examples use a single webcam for compatibility.

Going further

AR applications in PapARt follow a particular design; the tutorials are here to help. We encourage you to read about the PapARt helper, the display model, and the PaperScreen abstraction.
The next step is to calibrate your camera, a link to the calibration guide will be posted here.

About the author: Jeremy Laviole is the main developer of PapARt and promotes open source technologies for educational and professional uses at RealityTech.


#2

Hello @Jiii,

I’ve followed your tutorial to install PapARt.
Unfortunately I have an error (not a warning) on Mac OS with JavaCV.

No library found for org.bytedeco.javacpp
Libraries must be installed in a folder named ‘libraries’ inside the sketchbook folder (see the Preferences window).


#3

Hi Philippe!

Did you install the all-architecture library?

Either the library could not be found, or it could not be loaded, I guess the error message is the same.


#4

Yep, I followed the tutorial step by step.
I will test with a version of JavaCV compiled on my computer.
I will let you know the result :slight_smile:


#5

Is the installation correct? With Processing you should have this folder structure:

Processing/libraries/javacv/javacv.jar; the other jars in this folder will also be loaded.

Do you run from the Processing IDE?
Do you see the library in the IDE: Sketch -> Import library -> Javacv?
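Since the jar can be named javacv.jar or carry a version suffix depending on the release, a small Java check for "any javacv jar in the folder" can save a round-trip. This is only a sketch; the folder path in main() is an assumption, adjust it to your own sketchbook location:

```java
import java.io.File;

public class JavacvCheck {
    // True if the folder contains a jar whose name starts with "javacv"
    // (covers both "javacv.jar" and versioned names like "javacv-1.4.jar").
    static boolean hasJavacvJar(File folder) {
        File[] jars = folder.listFiles(
                (dir, name) -> name.startsWith("javacv") && name.endsWith(".jar"));
        return jars != null && jars.length > 0;
    }

    public static void main(String[] args) {
        // Assumed default location; adjust to your sketchbook.
        File folder = new File(System.getProperty("user.home"),
                "Documents/Processing/libraries/javacv/library");
        System.out.println("javacv jar present: " + hasJavacvJar(folder));
    }
}
```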


#6

Ok, I’m making some progress. It seems that the current naming convention ("javacv-1.3.jar" instead of "javacv.jar") is a problem ^^

I now have the following errors…

java.lang.UnsatisfiedLinkError: no jniopencv_core in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:804)
at org.bytedeco.javacpp.Loader.load(Loader.java:613)
at org.bytedeco.javacpp.Loader.load(Loader.java:530)
at org.bytedeco.javacpp.opencv_core.(opencv_core.java:10)
at fr.inria.papart.calibration.HomographyCreator.init(HomographyCreator.java:61)
at fr.inria.papart.calibration.HomographyCreator.(HomographyCreator.java:56)
…


Caused by: java.lang.UnsatisfiedLinkError: no opencv_imgproc in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:804)
at org.bytedeco.javacpp.Loader.load(Loader.java:604)
… 27 more


#7

Do you have OpenCV installed with a package or from a manual compilation? If so, uninstall it.

The OS will try to load your installed library first, then the pre-compiled ones in JavaCV. If they are not the same version, it creates a mess.
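To see what the JVM will actually search, you can print java.library.path and reproduce the same kind of error with plain Java, no PapARt needed. A minimal sketch:

```java
public class NativeLoadCheck {
    public static void main(String[] args) {
        // These are the directories the JVM searches for native libraries;
        // a system-wide OpenCV here can shadow the binaries bundled with JavaCV.
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));

        // Loading a library that is not on that path fails exactly like the
        // errors above: "no <name> in java.library.path".
        try {
            System.loadLibrary("no_such_native_lib");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("UnsatisfiedLinkError: " + e.getMessage());
        }
    }
}
```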


#8

So I uninstalled OpenCV and restarted, but I have the same problems. I also replaced the one downloaded here with my own compiled javacv jar, but it’s the same error :frowning_face:


#9

What is the full dump? Do you also have log files?


#10

Here is the full log (it repeats with each draw loop).

Starting a PaperTouchScreen. simple$MyApp
Font-family: 'Linux Biolinum’
Cannot start the tracking with cam: true, board: MarkerBoard /Users/somebody/Documents/Processing/libraries/PapARt/data/markers/A4-default.svg

java.lang.UnsatisfiedLinkError: no jniopencv_core in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:1215)
at org.bytedeco.javacpp.Loader.load(Loader.java:974)
at org.bytedeco.javacpp.Loader.load(Loader.java:873)
at org.bytedeco.javacpp.opencv_core.(opencv_core.java:10)
at fr.inria.papart.calibration.HomographyCreator.init(HomographyCreator.java:61)
at fr.inria.papart.calibration.HomographyCreator.(HomographyCreator.java:56)
at fr.inria.papart.procam.PaperScreen.computeWorldToScreenMat(PaperScreen.java:347)
at fr.inria.papart.procam.PaperScreen.pre(PaperScreen.java:231)
at fr.inria.papart.procam.PaperTouchScreen.pre(PaperTouchScreen.java:86)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at processing.core.PApplet$RegisteredMethods.handle(PApplet.java:1427)
at processing.core.PApplet$RegisteredMethods.handle(PApplet.java:1420)
at processing.core.PApplet.handleMethods(PApplet.java:1614)
at processing.core.PApplet.handleDraw(PApplet.java:2428)
at processing.opengl.PSurfaceJOGL$DrawListener.display(PSurfaceJOGL.java:859)
at jogamp.opengl.GLDrawableHelper.displayImpl(GLDrawableHelper.java:692)
at jogamp.opengl.GLDrawableHelper.display(GLDrawableHelper.java:674)
at jogamp.opengl.GLAutoDrawableBase$2.run(GLAutoDrawableBase.java:443)
at jogamp.opengl.GLDrawableHelper.invokeGLImpl(GLDrawableHelper.java:1293)
at jogamp.opengl.GLDrawableHelper.invokeGL(GLDrawableHelper.java:1147)
at com.jogamp.newt.opengl.GLWindow.display(GLWindow.java:759)
at com.jogamp.opengl.util.AWTAnimatorImpl.display(AWTAnimatorImpl.java:81)
at com.jogamp.opengl.util.AnimatorBase.display(AnimatorBase.java:452)
at com.jogamp.opengl.util.FPSAnimator$MainTask.run(FPSAnimator.java:178)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
Caused by: java.lang.UnsatisfiedLinkError: no opencv_imgproc in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:1215)
at org.bytedeco.javacpp.Loader.load(Loader.java:959)
… 27 more


#11

Did you manage to compile javacpp-presets on your machine?

You need to check out the repository at tag 1.4:

Then compile the libraries:

mvn install --projects .,librealsense,libfreenect,libfreenect2,opencv,ffmpeg,artoolkitplus,libdc1394,flandmark

Once you have that, you can put all of those in /Users/somebody/Documents/Processing/libraries/javacv/library/.

You do not need to compile JavaCV itself; there is no native code in it.

If it all compiles and works, we can distribute it here for your specific OSX version.


#12

So … it’s working :slight_smile:
I just did an ugly copy and paste from this downloaded folder http://search.maven.org/remotecontent?filepath=org/bytedeco/javacv-platform/1.4/javacv-platform-1.4-bin.zip to
…/Processing/libraries/javacv/library/.

No errors so far with the simple example found in step 2:

Launch the example Sketchbook -> PapARt-examples -> first-examples -> Debug -> simple.


#13

Great! That’s exactly what was/should be done in the all-architecture distribution; I need to check it out.


#15

Hi! I tried to follow step 3, but it seems like the code doesn’t recognize my webcam.

Starting a PaperScreen. SeeThroughGUI$MyApp
Could not FFMPEG frameGrabber… org.bytedeco.javacv.FrameGrabber$Exception: avformat_open_input() error -2: Could not open input “/dev/video1”. (Has setFormat() been called?)
Camera ID /dev/video1: could not start.
Check cable connection, ID and resolution asked.
No object for introspection, we use PApplet.
skatolo 2.1.1 infos, comments, questions at https://github.com/potioc/skatolo
Font-family: ‘Linux Biolinum’

Before trying the PCConfiguration example, I tried the “GettingStartedCapture” example in the Video library, and it gave me the output below.

Available cameras:
[0] “name=ATIV Real HD Camera,size=640x480,fps=30”
[1] “name=ATIV Real HD Camera,size=160x120,fps=30”
[2] “name=ATIV Real HD Camera,size=320x180,fps=30”
[3] “name=ATIV Real HD Camera,size=320x240,fps=30”
[4] “name=ATIV Real HD Camera,size=352x288,fps=30”
[5] “name=ATIV Real HD Camera,size=424x240,fps=30”
[6] “name=ATIV Real HD Camera,size=640x360,fps=30”
[7] “name=ATIV Real HD Camera,size=848x480,fps=10”
[8] “name=ATIV Real HD Camera,size=848x480,fps=20”
[9] “name=ATIV Real HD Camera,size=960x540,fps=10”
[10] “name=ATIV Real HD Camera,size=960x540,fps=15”
[11] “name=ATIV Real HD Camera,size=1280x720,fps=10”
[12] “name=ATIV Real HD Camera,size=640x480,fps=30”
[13] “name=ATIV Real HD Camera,size=160x120,fps=30”
[14] “name=ATIV Real HD Camera,size=320x180,fps=30”
[15] “name=ATIV Real HD Camera,size=320x240,fps=30”
[16] “name=ATIV Real HD Camera,size=352x288,fps=30”
[17] “name=ATIV Real HD Camera,size=424x240,fps=30”
[18] “name=ATIV Real HD Camera,size=640x360,fps=30”
[19] “name=ATIV Real HD Camera,size=848x480,fps=30”
[20] “name=ATIV Real HD Camera,size=960x540,fps=30”
[21] “name=ATIV Real HD Camera,size=1280x720,fps=30”

Can you help me out?


#16

Hello Seowoo

The easiest way to test is to use the OPENCV driver with camera 0, as this works on any operating system.
If you have only one camera plugged in and you are on Linux, you can indeed use FFMPEG with /dev/video0. You can check that the file exists beforehand.

Another possibility for you is to use the PROCESSING video input and type ATIV Real HD Camera (copy-paste does not work yet). You can select this in the PCConfiguration example.

The use of camera drivers is also discussed on this topic.
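On Linux, the "check that the file exists" step above can be done in plain Java before handing the path to PapARt. A small sketch; the device names are the usual V4L2 defaults (/dev/video0 for the first camera), not something PapARt-specific:

```java
import java.io.File;

public class CameraDeviceCheck {
    // True if the given device node exists (Linux V4L2: /dev/video0 is
    // usually the first camera, /dev/video1 the second).
    static boolean deviceExists(String path) {
        return new File(path).exists();
    }

    public static void main(String[] args) {
        String dev = args.length > 0 ? args[0] : "/dev/video0";
        System.out.println(dev + (deviceExists(dev) ? " exists" : " is missing"));
    }
}
```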


#17

Oh thanks! It worked perfectly, and I think it detects markers more accurately than the nyar4psg!! :slight_smile:

By the way, I am thinking about using a Raspberry Pi, Processing, and an IR camera for the Raspberry Pi.
Will it be possible to use an IR camera with a Raspberry Pi?

And can you let me know where the documentation of the functions and methods of this library is?
Also, will it be possible for me to create my own custom markers?

Sorry for so many questions, thanks so much!


#18

Awesome :smiley:

Yes, I suppose so. I just found out you can use the RPI camera as a UVC camera here, in the section INSTALL UV4L CAMERA DRIVER.

Once it works as a UVC camera you can load it in PapARt with the OpenCV/FFMPEG drivers. I tested this a few months ago, and @nclsp will work on RPI support soon. It could also work with the Processing video drivers, but I’m not sure. However, I don’t have a NoIR camera to test.

The tutorials are expanding on this forum: papart-tutorials category, and documentation (javadoc) is available here.
Most features are exposed through the examples, and I will be glad to answer your questions on this forum.

For custom markers, what do you mean? We support ARToolkitPlus markers, and natural feature tracking as an experimental feature. I will soon release a new version that eases the process of markerboard creation (with ARToolkitPlus markers). I guess we should also create a tutorial for natural feature tracking.

Cheers,
Jeremy.


#19

Thanks for the reply! :smile:

I have a NoIR camera indeed, so I’ll let you know if I fail or not after I try it!

And yes, I was asking how I could create my own markers. The files in the data folder seemed to be in SVG format, so I was wondering if I could just create square markers in SVG format with Adobe Illustrator.

And is there any method that returns the 2D positions of the 4 corners (top right, bottom right, top left, bottom left) of the marker? I used to use nyar4psg but am currently rewriting the code to use PapARt; nyar4psg returned the 4 corner values as PVectors. I found the method “getScreenPos()”, but it only returned the position of the bottom-right corner of the markerboard.

By the way, I’m not a professional coder, just a design student trying to use AR for my project. There could be a lot of dumb questions, but thanks for kindly replying!


#20

For now you can use ARToolkitPlus markers in the SVG file; we follow a codified way to place them in the file. I will create a post about markerboard creation. You can already create your own markers with natural feature tracking: Natural feature tracking (any image). It may not be as good as ARToolkit yet.

You can get the marker corners:

DetectedMarker[] markers = getCameraTracking().getDetectedMarkers(); // in a PaperScreen
// DetectedMarker[] markers = papart.getMarkerList(); // outside a PaperScreen

if (markers.length > 0) {
  int id = markers[0].getId();
  PVector[] corners = markers[0].getCorners();
}

Full javadoc: http://dist.rea.lity.tech/papart/javadoc/

getScreenPos() gives a 3D location from the camera's point of view. getCorners() gives you pixel positions in the camera image.

You can get the corresponding 3D point for a corner by calling this:

PVector p = getDisplay().projectPointer(this, px, py);   // in a PaperScreen

I will create a tutorial out of this.