This binding has been deprecated - please see the most recent release notes for more information.

Interactive Bitmaps


In this tutorial, we further explore GestureWorks Core and the Java bindings by consuming gesture events from GestureWorks Core and manipulating on-screen images represented as PImage objects within Processing. It also introduces object manipulations such as dragging, rotation, and scaling, the values for which are obtained from GestureWorks.

In this application, the user is able to drag the objects (the blue GestureWorks logos) around the screen, scale them using the familiar “pinch”-style gesture, and rotate them using the n-finger rotate gesture. For this tutorial you will need the GestureWorks Core multitouch framework; a free trial is available.

Download the code for all of the Java & Processing multitouch tutorials here:



Estimated time to complete: 45 minutes

This tutorial assumes that you have completed the steps found in Java & Processing: Getting Started I (Hello World) and Java & Processing: Getting Started II (Hello Multitouch). You should have a fully configured NetBeans project ready for use and should be familiar with consuming GestureWorks point event data. If you have not yet done so, please follow both Java & Processing: Getting Started I (Hello World) to prepare a NetBeans project and Java & Processing: Getting Started II (Hello Multitouch) to become familiar with consuming GestureWorks touch event data.

In addition to the above, this tutorial expects the student to have a basic understanding of the Processing language and to be familiar with object-oriented programming concepts.

Process Overview

Process Detail

1. Create touch object

Create a TouchObject to load the image and to handle hit testing (see step 4 for hit testing details) and object transformations.

Add a TouchObject Java class to the project and set up the constructor, as follows, to require a PApplet object, specifying the sketch to draw to, and an image source path.

public TouchObject(PApplet sketch, String imgSrc) {
    this.sketch = sketch;
    this.image = sketch.loadImage(imgSrc);
    this.width = image.width;
    this.height = image.height;
}

The draw method manages all transformations through a transform matrix. The idea is to declare global transformation properties and call this method on each frame, thus updating the matrix with property changes and applying the resulting transformation to the object.

Add the following draw method and declare the required public properties.

public void draw() {
    drawX = GWCUtils.getDrawingCoordX(width, height, x, y, (float) Math.toRadians(rotation), scale);
    drawY = GWCUtils.getDrawingCoordY(width, height, x, y, (float) Math.toRadians(rotation), scale);
    sketch.pushMatrix();
    sketch.translate(drawX, drawY);
    sketch.rotate((float) Math.toRadians(rotation));
    sketch.scale(scale);
    sketch.image(image, 0, 0);
    sketch.popMatrix();
}

2. Modify multitouch sketch from Tutorial #2

Now, modify and expand the existing TouchSketch object, from Tutorial #2, to display bitmaps and translate gesture events into interactions.

Create the object registration method to instantiate a TouchObject, store it in a hash map, register the object with GestureWorks, and register its gestures with GestureWorks.

This method takes an image source path and an object name as arguments.

public void addTouchObject(String path, String name) {
    touchObjects.put(name, new TouchObject(this, path));
    gwCore.registerTouchObject(name);
    gwCore.addGesture(name, "n-drag");
    gwCore.addGesture(name, "n-rotate");
    gwCore.addGesture(name, "n-scale");
}

Now expand the setup method to register two TouchObjects with the additional lines.

 addTouchObject("resources/Processing.png", "img1");
 addTouchObject("resources/Processing.png", "img2");
Modify the draw method to reflect the following.

public void draw() {
    background(169, 169, 169);
    processPointEvents(gwCore.consumePointEvents());
    processGestureEvents(gwCore.consumeGestureEvents());
    for(TouchObject to: touchObjects.values()) {;
    }
}
Notice the addition of the GestureWorks call to consume GestureEvents, along with an associated processing method. Because this application will assign touch points to objects (the next step), GestureEvent data can now be analyzed.

The second modification is the replacement of the touch point graphic with the iterative call to the draw method of the registered TouchObjects. This will update the objects’ transformation properties and apply the new transformations.

3. Process Touch Events

Again, create a method for processing touch events and check the status of each event. This time, however, also check whether new touch points collide with any of our touch objects. If one does, inform GestureWorks so it can add the touch point to the object and check for gesture events involving the new touch point.

private void processPointEvents(ArrayList<PointEvent> events) {
    for(PointEvent event: events) {
        switch(event.status) {
            case Status.TOUCHADDED:
                Object[] names = touchObjects.keySet().toArray();
                for(int i = names.length - 1; i >= 0; i--) {
                    String name = (String) names[i];
                    if(touchObjects.get(name).hitTest(event.position.x, event.position.y)) {
                        gwCore.addTouchPoint(name, event.point_id);
                    }
                }
                break;
            case Status.TOUCHUPDATE:
                break;
            case Status.TOUCHREMOVED:
                break;
        }
    }
}

Handling touch event statuses other than TOUCHADDED is not required, but they are included here for illustration.

The touch object list is walked in reverse so that hit testing matches the objects' stacking order on the canvas: the object drawn last (on top) is tested first.
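As an illustration, here is a minimal, self-contained sketch of that reverse walk; the Box class and the object names are stand-ins for the tutorial's TouchObject instances, and this variant returns only the topmost hit:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ReverseHitTest {
    // Minimal stand-in for TouchObject: an axis-aligned square of side 100.
    static class Box {
        float x, y;
        Box(float x, float y) { this.x = x; this.y = y; }
        boolean hitTest(float px, float py) {
            return Math.abs(px - x) <= 50 && Math.abs(py - y) <= 50;
        }
    }

    // Walk registration order backwards so the object drawn last (on top) wins.
    static String topmostHit(Map<String, Box> objects, float px, float py) {
        Object[] names = objects.keySet().toArray();
        for (int i = names.length - 1; i >= 0; i--) {
            String name = (String) names[i];
            if (objects.get(name).hitTest(px, py)) return name;
        }
        return null;
    }

    public static void main(String[] args) {
        Map<String, Box> objects = new LinkedHashMap<>();
        objects.put("img1", new Box(100, 100));
        objects.put("img2", new Box(120, 100)); // overlaps img1, drawn on top
        System.out.println(topmostHit(objects, 110, 100)); // img2
    }
}
```

A LinkedHashMap preserves registration (and therefore drawing) order, which is what makes the reverse walk meaningful.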

4. Hit Test And Associate Touch Points With Touch Objects

Hit testing is straightforward in this example. Iterate through all of the touch objects, rotate the touch point about each object's center by the inverse of the object's current rotation, and check whether the point lies within the object's bounding box. Return true on the first detected collision. Since this is meant to be a tutorial on how to use GestureWorks and not on two-dimensional transformations, we provide a function to rotate a point about another arbitrary point (rotateAboutCenter) and leave it to the reader to investigate these concepts further.

Add this hitTest method to the TouchObject class.

public boolean hitTest(float x, float y) {
    float localX = GWCUtils.rotateAboutCenterX(x, y, this.x, this.y, -(float) Math.toRadians(rotation));
    float localY = GWCUtils.rotateAboutCenterY(x, y, this.x, this.y, -(float) Math.toRadians(rotation));
    boolean inWidth = Math.abs(localX - this.x) <= (width * scale) / 2;
    boolean inHeight = Math.abs(localY - this.y) <= (height * scale) / 2;
    return inWidth && inHeight;
}
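For readers who want the underlying math, this is a standard 2-D rotation about an arbitrary center followed by an axis-aligned bounding-box check. The sketch below is an assumption about what the rotateAboutCenter helpers compute, not their actual source:

```java
public class HitTestMath {
    // Rotate point (px, py) about center (cx, cy) by theta radians.
    static float[] rotateAboutCenter(float px, float py, float cx, float cy, float theta) {
        float dx = px - cx, dy = py - cy;
        float cos = (float) Math.cos(theta), sin = (float) Math.sin(theta);
        return new float[] { cx + dx * cos - dy * sin, cy + dx * sin + dy * cos };
    }

    // Rotated-rectangle hit test: un-rotate the point into the object's local
    // frame, then perform a plain axis-aligned bounding-box check.
    static boolean hitTest(float px, float py, float cx, float cy,
                           float w, float h, float rotationDeg, float scale) {
        float[] local = rotateAboutCenter(px, py, cx, cy,
                -(float) Math.toRadians(rotationDeg));
        return Math.abs(local[0] - cx) <= (w * scale) / 2
            && Math.abs(local[1] - cy) <= (h * scale) / 2;
    }

    public static void main(String[] args) {
        // A 200x100 object centered at the origin, rotated 90 degrees:
        // the point (0, 90) now lies inside it, while (90, 0) does not.
        System.out.println(hitTest(0, 90, 0, 0, 200, 100, 90, 1.0f)); // true
        System.out.println(hitTest(90, 0, 0, 0, 200, 100, 90, 1.0f)); // false
    }
}
```

Note the negative angle: the touch point is rotated by the inverse of the object's rotation so the comparison can happen in the object's unrotated frame.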

5. Process Gesture Events

Add the following implementation to analyze the gesture events and execute their corresponding handlers.

private void processGestureEvents(ArrayList<GestureEvent> events) {
    for(GestureEvent event: events) {
        if(event.gesture_id.equals("n-drag")) { handleDrag(event); }
        else if(event.gesture_id.equals("n-rotate")) { handleRotate(event); }
        else if(event.gesture_id.equals("n-scale")) { handleScale(event); }
    }
}

The method iterates through each event and checks its gesture_id property. These IDs will correspond to the gesture names that were registered for each touch object at the beginning of step 2. The names of the gestures are defined in the GML file loaded when GestureWorks was initialized. After the gesture is determined, it is passed to an associated handler method.

GestureWorks identifies which object the gesture applies to via the target field of the gesture event. This name will correspond to the string used to register the touch object in step 2.
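The combination of dispatching on gesture_id and then looking up the locally registered object via the target field can be exercised without GestureWorks at all; every class and name below is a stand-in:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class GestureDispatch {
    // Stand-ins for the GestureWorks event and touch object classes.
    static class GestureEvent {
        String gesture_id, target;
        float dx;
        GestureEvent(String id, String target, float dx) {
            this.gesture_id = id; this.target = target; this.dx = dx;
        }
    }
    static class TouchObject { float x = 0; }

    static Map<String, TouchObject> touchObjects = new HashMap<>();

    // Route each event by gesture id, then use its target field to find
    // the locally registered object the gesture applies to.
    static void processGestureEvents(List<GestureEvent> events) {
        for (GestureEvent e : events) {
            if (e.gesture_id.equals("n-drag")) {
                TouchObject to = touchObjects.get(;
                to.x += e.dx; // a real handler would read event.values here
            }
        }
    }

    public static void main(String[] args) {
        touchObjects.put("img1", new TouchObject());
        List<GestureEvent> events = new ArrayList<>();
        events.add(new GestureEvent("n-drag", "img1", 5f));
        processGestureEvents(events);
        System.out.println(touchObjects.get("img1").x); // 5.0
    }
}
```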

  • For the most part, relevant gesture event data is stored in the values member of the event object. This member is a map of strings to floats. The string used to address any value is based on that attribute's name in the GML; see the GML specification for how these are laid out. For our example, these attribute names can be viewed in the basic_manipulations.gml file loaded previously.
  • It is also important to note that these values are generally expressed as deltas; that is, they represent differentials between the current state and the previous state; e.g., drag_dx and drag_dy each represent the change in position between this frame and the last.
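To make the delta convention concrete, here is a toy accumulation of drag deltas; the value names follow the tutorial's GML, and a plain HashMap stands in for the event's values member:

```java
import java.util.HashMap;
import java.util.Map;

public class DeltaAccumulation {
    // Running position, updated by per-frame deltas.
    static float x = 0, y = 0;

    // Apply one frame's worth of drag deltas to the running position.
    static void handleDrag(Map<String, Float> values) {
        x += values.get("drag_dx");
        y += values.get("drag_dy");
    }

    public static void main(String[] args) {
        Map<String, Float> frame1 = new HashMap<>();
        frame1.put("drag_dx", 5.0f); frame1.put("drag_dy", -2.0f);
        Map<String, Float> frame2 = new HashMap<>();
        frame2.put("drag_dx", 3.0f); frame2.put("drag_dy", 4.0f);
        handleDrag(frame1);
        handleDrag(frame2);
        System.out.println(x + "," + y); // 8.0,2.0
    }
}
```

Because each event carries only the change since the previous frame, handlers never need to track absolute gesture state themselves.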

6. Overview Of Object Transformations

Drag and scale gestures are straightforward and handled in similar ways. GestureWorks reports most gesture values as per-frame deltas, so they can be applied directly as increments.

Implement the drag handler by retrieving the associated TouchObject and applying the translation deltas to its x and y values.

private void handleDrag(GestureEvent event) {
    TouchObject to = touchObjects.get(;
    to.x += event.values.getValue("drag_dx");
    to.y += event.values.getValue("drag_dy");
}

Scale is handled the same way except it increments the TouchObject’s scale attribute.

private void handleScale(GestureEvent event) {
    TouchObject to = touchObjects.get(;
    to.scale += event.values.getValue("scale_dsx");
}

Rotation is only slightly more complicated because the object rotates around the center of the cluster of touch points activating the gesture, which feels more natural than simply rotating the object around its own center. Notice that the rotateX and rotateY attributes represent the center point of the cluster.

private void handleRotate(GestureEvent event) {
    TouchObject to = touchObjects.get(;
    to.rotation += event.values.getValue("rotate_dtheta");
    to.rotateX = event.x;
    to.rotateY = event.y;
}

The object is rotated around its center and then rotated around the center of the gesture event. Note that the second rotation is only executed if there are touch points actively driving the gesture: when an inertial filter is active, the event can be triggered with no active touch points, in which case the object should only be rotated about its own center. For more information on gesture filters, please see the GML overview. One final important note: GestureWorks provides rotation values in degrees, but the rotate function expects radians.
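The degrees-versus-radians detail is the kind that silently produces wildly wrong rotations, so it is worth pinning down. One workable convention, sketched with illustrative property names, is to accumulate the gesture's degree deltas and convert only at draw time:

```java
public class RotationUnits {
    // GestureWorks reports rotation in degrees; keep the accumulated property
    // in degrees and convert only at draw time, where rotate() wants radians.
    static float rotationDeg = 0f;

    static void applyRotateDelta(float dthetaDeg) { rotationDeg += dthetaDeg; }

    static float drawAngleRadians() { return (float) Math.toRadians(rotationDeg); }

    public static void main(String[] args) {
        applyRotateDelta(30f);
        applyRotateDelta(15f);
        // 45 degrees accumulated equals pi/4 radians at draw time.
        System.out.println(drawAngleRadians() == (float) (Math.PI / 4)); // true
    }
}
```

Converting once at the boundary keeps the handler code in the same units GestureWorks speaks.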

7. Compile and run

Now, compile and run the application; you should see two images in the upper left corner that can be manipulated via drag, rotate, and scale interactions.


In this tutorial we expanded on the knowledge gained in Java & Processing: Getting Started II (Hello Multitouch) by manipulating on-screen objects using gesture data obtained from GestureWorks Core. This tutorial covers a number of concepts that are central to the usage of GestureWorks:

  • Registering touch objects
  • Adding gestures to the registered touch objects
  • Manipulating Processing’s PImage object instances based on data obtained from GestureWorks

A principal concept is that due to the language- and framework-independent nature of GestureWorks, it is the programmer’s responsibility to associate the local implementation of an object with an object as tracked by GestureWorks. To review: GestureWorks doesn’t know anything about the TouchObject class that we defined; it is our responsibility to associate the data received from GestureWorks with our in-application objects.

Continuing Education

This and the other Tutorials and Legacy Tutorials are intended to get the programmer started with the GestureWorks Core framework; they are by no means an exhaustive explanation of all of the features of GestureWorks Core. In fact, we've only barely scratched the surface!

For more information on GestureWorks Core, GestureML, and CreativeML, please visit the following sites:

Previous tutorial: Java & Processing: Getting Started II (Hello Multitouch)

tutorials/legacy/java_processing/interactive_bitmaps.txt · Last modified: 2019/01/21 19:24 (external edit)