
MetaBow Toolkit

The MetaBow Toolkit is a set of abstractions designed around the MetaBow™, distributed as a Max/MSP package. To navigate the package, work your way through each section in order, starting from the Framework, then through the Extractors, Sensor Control, and Processing tabs, to explore the possibilities the MetaBow Toolkit offers.

Framework

The MetaBow Toolkit Framework is designed to allow both beginners and advanced Max users to incorporate the MetaBow into their creative projects. All of the utilities afforded to you by the toolkit are presented as abstractions that can be used either as a ‘bpatcher’ (with a graphical user interface) or as a plain object. For those who prefer tactile controls and configuration by moving sliders, dials, or panels, the bpatcher interface is ideal. For an advanced user who is more comfortable with plain objects, a GUI might be unnecessary. Nonetheless, the abstractions work in exactly the same way, and it is up to you to decide which type of interface you need to work comfortably with these tools.

As you can see below, the same abstraction for processing accelerometer data, “mb.accel”, is demonstrated in both formats. Both work exactly the same, except that the plain abstraction hides the GUI, which may also yield a small performance gain if you have many instances of these.

1. MUBU

Part of the MetaBow toolkit is built on top of IRCAM’s MuBu package. You can install the latest version from the Max package manager or the link below. This is particularly relevant for the components of the package that use gesture recognition.

https://forum.ircam.fr/projects/detail/mubu/

2. Antescofo

Part of the MetaBow toolkit is built on top of the Antescofo package from IRCAM. You can install it using the following link:

https://github.com/nadirB/atom-antescofo

http://forumnet.ircam.fr/fr/produit/antescofo/

3. Messages

All abstractions can be configured programmatically with messages. Each abstraction stores its state internally as a dictionary, which can then be saved to be loaded later or extracted for whatever purpose you see fit. Messages make controlling the abstractions simple and clear. No need to remember which inlet belongs to which control!
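
To illustrate the idea outside Max, here is a minimal Python sketch of message-driven configuration backed by a saveable dictionary. The class, message names, and values are all hypothetical, not part of the toolkit:

```python
import json

class Abstraction:
    """Sketch of an abstraction configured by named messages (names hypothetical)."""

    def __init__(self):
        self.state = {}  # internal state, analogous to a Max dictionary

    def message(self, name, *args):
        # each message sets one named parameter; no inlet bookkeeping needed
        self.state[name] = args[0] if len(args) == 1 else list(args)

    def save(self):
        return json.dumps(self.state)  # export state for later reuse

    def load(self, blob):
        self.state = json.loads(blob)  # restore a previously saved state

a = Abstraction()
a.message("smooth", 0.8)
a.message("range", 0, 127)
saved = a.save()

b = Abstraction()
b.load(saved)      # a second instance picks up the saved configuration
print(b.state)     # {'smooth': 0.8, 'range': [0, 127]}
```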

4. Connecting

MetaBow Toolkit abstractions are designed to be plugged together. This convention follows some practical and common strategies for sensor processing (extraction, sanitisation, processing) while giving you the flexibility to do whatever you want with the data. The most important thing to understand is that all information is communicated as messages containing a prefixed identifier. This means that the ‘route’ object is your new best friend!
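
The pattern is easy to sketch outside Max. Below is a small Python analogue of splitting a prefixed message stream by its first token, as the ‘route’ object does; the prefixes and handlers are illustrative, not the toolkit’s actual message names:

```python
def route(message, handlers):
    """Dispatch a prefixed message to the handler matching its first token,
    mimicking how Max's 'route' object splits a stream by selector."""
    prefix, *payload = message
    handler = handlers.get(prefix)
    if handler:
        handler(payload)

readings = {}
handlers = {
    "accel": lambda v: readings.update(accel=v),
    "gyro":  lambda v: readings.update(gyro=v),
}

route(["accel", 0.1, -0.2, 0.98], handlers)
route(["gyro", 5.0, 0.3, -1.2], handlers)
print(readings)  # {'accel': [0.1, -0.2, 0.98], 'gyro': [5.0, 0.3, -1.2]}
```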

Extractors

To assist you in receiving data from the MetaBow, we have introduced an ‘extractor’ abstraction. This aids you in connecting to the device and observing any changes to the integrity of the connection in real-time. While this toolkit is designed to be used with the MetaBoard, other extractors (e.g. Bitalino’s r-IoT, Kimura’s Mugic) are offered for you to integrate different interfaces.

Sensor Control

A priority when working with sensors is making the data received from the controller manageable. This may involve smoothing the data or scaling it from one numerical range to another. As such, ‘sensor control’ abstractions help you manipulate data before you send it to other processes where it is analysed or mapped directly onto controls.
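
A common smoothing strategy is a one-pole low-pass filter, where each output mixes the previous output with the new sample. Here is a Python sketch of the idea, for illustration only (this is not the toolkit’s own code):

```python
def smooth(stream, factor=0.5):
    """One-pole low-pass filter. A factor near 1.0 means heavy smoothing,
    near 0.0 means the input passes through almost unchanged."""
    out, prev = [], None
    for x in stream:
        prev = x if prev is None else factor * prev + (1 - factor) * x
        out.append(prev)
    return out

# A jump from 0 to 10 is eased in over several samples:
print(smooth([0, 10, 10, 10], factor=0.5))  # [0, 5.0, 7.5, 8.75]
```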

1. Accelerometer

Isolate accelerometer data from your extractor of choice.

2. Magnetometer

Isolate magnetometer data from your extractor of choice.

3. Gyroscope

Isolate gyroscope data from your extractor of choice.

4. General Purpose Mapping and Scaling

Scaling is useful for taking an input number range and mapping it to an output number range. The abstraction can be programmatically altered in real-time and features a learn mode, in which it maintains a memory of the minimum and maximum numbers it receives in the input stream until it is switched back to play mode. This can be useful if you don’t know the ranges to set in advance.
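
Here is a minimal Python sketch of that behaviour, with a learn mode that remembers the input extremes and a play mode that rescales; the class and parameter names are hypothetical, not the toolkit’s:

```python
class Scaler:
    """Maps an input range (learned from the stream) to a fixed output range."""

    def __init__(self, out_lo=0.0, out_hi=1.0):
        self.in_lo = self.in_hi = None
        self.out_lo, self.out_hi = out_lo, out_hi
        self.learning = False

    def learn(self, on):
        self.learning = on
        if on:
            self.in_lo = self.in_hi = None  # forget previous extremes

    def process(self, x):
        if self.learning:
            # remember the smallest and largest values seen so far
            self.in_lo = x if self.in_lo is None else min(self.in_lo, x)
            self.in_hi = x if self.in_hi is None else max(self.in_hi, x)
            return x
        if self.in_lo is None or self.in_hi == self.in_lo:
            return self.out_lo  # nothing learned yet, or degenerate range
        t = (x - self.in_lo) / (self.in_hi - self.in_lo)
        return self.out_lo + t * (self.out_hi - self.out_lo)

s = Scaler(0.0, 127.0)
s.learn(True)
for x in [2.0, 9.0, 5.0]:   # feed the stream while learning
    s.process(x)
s.learn(False)               # back to play mode: input range is now [2, 9]
print(s.process(5.5))        # 63.5
```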

Processing

Once there is clean and workable sensor data, we have the opportunity to analyse it. The MetaBow Processing family of abstractions is responsible for performing tasks such as gesture recognition and gesture mapping with this data.

1. Continuous Gesture Recognition

Continuous Gesture Recognition is useful when you want to identify which gesture is being performed as well as ‘where’ in the gesture you currently are. Imagine drawing two different shapes. Your hand performs two unique gestures over time to do this. Continuous Gesture Recognition is able to recognise not only which shape you are drawing, but where in the drawing of that shape you are.
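
One toy way to picture this, which is not the toolkit’s actual algorithm: match the incoming point against each stored trajectory and report both the best-matching gesture and how far along it the nearest point lies. The template data here is invented for illustration:

```python
def follow(sample, templates):
    """Toy continuous follower: for each template trajectory, find the nearest
    point to the incoming 2-D sample; report the closest template's label and
    the fraction of that trajectory already traversed."""
    best = None
    for label, path in templates.items():
        for i, (px, py) in enumerate(path):
            d = (sample[0] - px) ** 2 + (sample[1] - py) ** 2
            if best is None or d < best[0]:
                best = (d, label, i / (len(path) - 1))
    return best[1], best[2]

templates = {
    "arc":  [(0, 0), (1, 1), (2, 0)],
    "line": [(0, 0), (1, 0), (2, 0)],
}
print(follow((1.0, 0.9), templates))  # ('arc', 0.5): drawing the arc, halfway
```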

2. Static Gesture Recognition

Static gesture recognition is useful when you want to identify what “state” is currently being occupied. A “state” might refer to a position of the hand/bow/arm, or a more general type of motion and gesture. If you aren’t concerned about “where” in the gesture you are, it can sometimes be more accurate and robust than continuous gesture recognition.
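
A toy analogue of static recognition, again not the toolkit’s implementation, is a nearest-template classifier: each state stores a representative feature vector, and an incoming sample is assigned to the closest one. The labels and vectors below are invented:

```python
import math

def classify(sample, templates):
    """Return the label whose stored feature vector is closest
    (Euclidean distance) to the incoming sample."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda label: dist(sample, templates[label]))

templates = {
    "up-bow":   [0.9, 0.1, 0.0],
    "down-bow": [-0.9, 0.1, 0.0],
    "rest":     [0.0, 0.0, 1.0],
}
print(classify([0.8, 0.2, 0.1], templates))  # up-bow
```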

Calibration

Sensors and audio levels will have different initial values depending on factors such as location and hardware. Calibration is therefore a necessary step for reproducing your work in different scenarios. You will thus need to calibrate the sensors, inputs, and objects according to your desired setup.

1. Antescofo

The first argument of a calibration message is the calibration level (shown here by the light blue flonum and multislider). It should stay above 0.75 (> 0.75) when it receives musical events and below 0.5 (< 0.5) when it does not (e.g. just picking up background noise). If the calibration level is not ideal, then the audio input level must be adjusted. In general, it is always better to perform the adjustments as close as possible to the beginning of the audio chain.

The second argument of a calibration message is pitch calibration (shown here by the yellow flonum and slider). Matching the tuning of antescofo~ and the instrument is also important. The instrument should play a reference A4: if the tuning is correct, the pitch calibration will approach 0.5. If not, either the instrument or antescofo~ require further tuning.

It is also possible to perform pitch calibration with a different reference pitch: just specify the corresponding MIDI note as the second argument of the calibrate message you send to antescofo~.
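
The thresholds above can be condensed into a small helper, sketched here in Python purely for illustration; the function names and the pitch tolerance are assumptions, not part of Antescofo:

```python
def calibration_ok(level, playing):
    """Rule of thumb from the text: the calibration level should exceed 0.75
    during musical events and stay below 0.5 on background noise alone."""
    return level > 0.75 if playing else level < 0.5

def in_tune(pitch_calibration, tolerance=0.05):
    """Pitch calibration approaches 0.5 when the instrument and antescofo~
    agree on tuning. The tolerance is an illustrative choice."""
    return abs(pitch_calibration - 0.5) < tolerance

print(calibration_ok(0.82, playing=True))   # True: good level on a note
print(calibration_ok(0.62, playing=False))  # False: too high on noise alone
print(in_tune(0.51))                        # True: close enough to 0.5
```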

Requirements

  • Max 8.1+
  • MUBU (Browse in Max Remote Packages)
  • CNMAT Externals (Browse in Max Remote Packages)
  • Antescofo: macOS, Windows
  • Bluetooth Connectivity