
Leonardo Merza

Full Stack Web Developer

LCD Temperature Project

The goal of this project was to test out a temperature sensor and LCD screen for a larger long-term project. It uses the DHT22 temperature and humidity sensor and displays both values on an LCD. A row of 15 LEDs was also added as an extra representation of the temperature, with each LED representing a 2 degree Fahrenheit change from 60 degrees (green LEDs) to 90 degrees (red LEDs). The LCD screen’s contrast is controlled with a potentiometer.

The DHT sensors come in DHT11 and DHT22 varieties. The DHT22 is twice the price but offers better accuracy, resolution, and range than the DHT11; the DHT11 is adequate for most projects. The DHT22 datasheet specifies a humidity range of 0-100% and a temperature range of -40 to 80 degrees Celsius, with accuracies of plus/minus 2% and 0.5 degrees Celsius respectively and resolutions of 0.1% and 0.1 degrees Celsius. The datasheet also provides the pin out, which can be seen in Figure 1 below.

dht22 pinout
Figure 1 – Datasheet pin out of DHT22

The DHT22 needs a supply voltage of 3.3 to 6V; in this project the sensor was given 5V. The sensor draws about 1.5mA and a 1kΩ pullup resistor is recommended on the data output. The DHT series are digital sensors that send two 16-bit pieces of data along with an 8-bit checksum. The first 16-bit value represents the humidity and the second represents the temperature in Celsius. The first bit of the temperature value is a sign bit, with one meaning negative.

When the sensor is first turned on, the MCU must activate the DHT sensor by sending a start pulse. Once the sensor receives the pulse, it sends a pulse back to signal the MCU that it is ready to send data. An illustration with the recommended timings is shown in Figure 2.

dht-1
Figure 2 – Timings of initializing DHT sensor

Once the sensor is ready to send data, it sends out each bit in 50uS increments. The data is read and then converted into integer form to be displayed for the user. Another illustration from the datasheet is shown in Figure 3.

dht-2
Figure 3 – Timings of sensor data output to MCU
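
To make the data format described above concrete, here is a minimal decoding sketch. It assumes the five raw bytes (humidity high/low, temperature high/low, checksum) have already been clocked in from the data line; the DHT library used in the project code below handles all of this for you.

// Decode the five raw DHT22 bytes: two 16-bit values plus an 8-bit checksum.
// rawBytes[] is assumed to have been filled by reading the data line.
bool decodeDHT22(const uint8_t rawBytes[5], float &humidity, float &temperatureC) {
  // the checksum is the low 8 bits of the sum of the first four bytes
  uint8_t sum = rawBytes[0] + rawBytes[1] + rawBytes[2] + rawBytes[3];
  if (sum != rawBytes[4]) {
    return false; // bad transfer
  }

  // first 16 bits: relative humidity in tenths of a percent
  humidity = ((rawBytes[0] << 8) | rawBytes[1]) * 0.1;

  // second 16 bits: temperature in tenths of a degree Celsius;
  // the top bit is the sign bit (one means negative)
  int16_t rawTemp = ((rawBytes[2] & 0x7F) << 8) | rawBytes[3];
  temperatureC = rawTemp * 0.1;
  if (rawBytes[2] & 0x80) {
    temperatureC = -temperatureC;
  }
  return true;
}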

The LCD screen is a GDM1602K 16×2 character LCD and its datasheet can be viewed here. The recommended supply voltage is 5V and it draws about 1.5mA. The LCD stores the incoming data and contains an internal clock that displays the characters independently of the MCU clock speed. The LCD has register select and read/write select pins (pins 4 and 5) to control the output, and four bi-directional bus lines that receive the actual display data from the MCU. The register select pin allows the MCU to move the current cursor position on the LCD matrix to write to the screen. A table of functions such as erase, change cursor position, and read can be found in the datasheet. Once the cursor and settings have been set, a table at the end of the datasheet gives the 8-bit representation of all ASCII characters that can be displayed on the LCD. A custom character can also be created from eight rows of bits, with each bit representing one pixel in that row of the character cell. A custom character generator program can be downloaded here.
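
As a quick aside on the custom character feature, here is a minimal sketch using the LiquidCrystal library's createChar() call. The degree-symbol pattern and the pin numbers are just an example and not part of this project's wiring; each byte in the array is one row of the character cell.

#include <LiquidCrystal.h>

// example wiring only: RS, E, DB4, DB5, DB6, DB7
LiquidCrystal lcd(12, 11, 5, 4, 3, 2);

// one byte per row of the character cell, one bit per pixel in that row
byte degreeSymbol[8] = {
  B00110,
  B01001,
  B01001,
  B00110,
  B00000,
  B00000,
  B00000,
  B00000
};

void setup() {
  lcd.begin(16, 2);
  lcd.createChar(0, degreeSymbol); // store the pattern in character slot 0
  lcd.setCursor(0, 0);
  lcd.print("Temp: 72.5");
  lcd.write(byte(0));              // print the custom degree symbol
  lcd.print("F");
}

void loop() {
}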

The LCD also has an operation enable at pin 6 and a contrast adjust at pin 3 that varies the contrast with an input voltage of 0-5V. This contrast input is controlled by a 10k potentiometer. Pins 11-14 are the DB4-DB7 inputs that actually write the data to the LCD screen. Overall, the LCD screen needs six pins from the MCU to function. The LCD pin out and dimensions can be viewed in Figure 4.

lcd
Figure 4 – LCD pin out and dimensions

The CD74HC4067E 16 channel multiplexer was used to drive the LED display because it was readily available. It was wired in the same configuration as in the lap timer project, but with a single 10k resistor on the common input pin. The final schematic is shown in Figure 5, along with a picture and the code below. Overall, this was a fairly simple project to test the DHT and LCD.

temp-lcd-circuit

Figure 5 – Complete Schematic of Project

IMG_20140512_193803

/***********************************************************************************
Created by: Leonardo Merza
Version: 1.02
***********************************************************************************/

/***********************************************************************************
Imports
***********************************************************************************/
#include <LiquidCrystal.h>
#include <DHT.h>

/***********************************************************************************
Variables
***********************************************************************************/
// RS, E, DB4, DB5, DB6, DB7 pins
int RSpin = 12;
int ePin = 11;
int db4Pin = 5;
int db5Pin = 4;
int db6Pin = 3;
int db7Pin = 2;

int lcdWidth = 16;
int lcdHeight = 2;

//multiplexer pins
int s0Pin = 6;
int s1Pin = 7;
int s2Pin = 8;
int s3Pin = 9;

int dhtPin = 10;

LiquidCrystal lcd(RSpin, ePin, db4Pin, db5Pin, db6Pin, db7Pin);
DHT dht;

// array of all multiplexer values possible
int muxPins[64] = {0,0,0,0, 1,0,0,0, 0,1,0,0, 1,1,0,0, 0,0,1,0, 1,0,1,0,
    0,1,1,0, 1,1,1,0, 0,0,0,1, 1,0,0,1, 0,1,0,1, 1,1,0,1, 0,0,1,1, 1,0,1,1, 0,1,1,1, 1,1,1,1};

/***********************************************************************************
***********************************************************************************/
void setup(){
  Serial.begin(9600);

  dht.setup(dhtPin);
  lcd.begin(lcdWidth, lcdHeight);

  pinMode(s0Pin,OUTPUT);
  pinMode(s1Pin,OUTPUT);
  pinMode(s2Pin,OUTPUT);
  pinMode(s3Pin,OUTPUT);
} // void setup(){

/***********************************************************************************
***********************************************************************************/
void loop(){
  // get DHT values
  float humidity = dht.getHumidity();
  float temperature = dht.toFahrenheit(dht.getTemperature());

  // set lcd and write temp values
  lcd.setCursor(0, 0);
  lcd.print("Temp: ");
  lcd.setCursor(5, 0);
  lcd.print(temperature);
  // place the "F" after the temperature reading
  lcd.setCursor(10, 0);
  lcd.print("F");

  // set lcd and write humidity values
  lcd.setCursor(0, 1);
  lcd.print("Humidity: ");
  lcd.setCursor(9, 1);
  lcd.print(humidity);
  if(humidity > 10){
    lcd.setCursor(14, 1);
  }
  else{
    lcd.setCursor(13, 1);
  }
  lcd.print("%");

  // start LED temps at 60 and light up LEDs based on current temp value
  int tempStart = 60;

  for(int i=0;i<64;i+=4){
    if(temperature>=tempStart){
      digitalWrite(s0Pin, muxPins[i]);
      digitalWrite(s1Pin, muxPins[i+1]);
      digitalWrite(s2Pin, muxPins[i+2]);
      digitalWrite(s3Pin, muxPins[i+3]);
    } // if(temperature>=tempStart){
    tempStart+=2;
    // delay so multiplexer can light up all LEDs
    delayMicroseconds(400);
  } // for(int i=0;i<64;i+=4){
} // void loop(){


MSGEQ7

The MSGEQ7 is a single channel seven band graphic equalizer. It divides the audio spectrum into seven bands centered at 63Hz, 160Hz, 400Hz, 1kHz, 2.5kHz, 6.25kHz, and 16kHz. It operates best at 5V and has 20dB of gain. The clock frequency is set by the parallel RC circuit at pin 8; the default values of 33pF and 200kΩ give an internal frequency of about 145kHz. The input impedance is 1MΩ, the output impedance is about 700 ohms, and the supply current is about 1mA.

The MSGEQ7 works by toggling the strobe pin to advance through the frequency channels, with a delay of about 72uS from channel to channel. An output settling time of 36uS is recommended before reading each channel. Each channel outputs a voltage that represents the volume level of that particular frequency band at that moment.

The schematic recommended for the MSGEQ7 is shown in Figure 1. The Arduino code below prints the values of the seven bands as columns in the serial monitor. The datasheet can be viewed here.

Capture

Figure 1 – The schematic for the MSGEQ7 as per the datasheet

/*------------------------------------------------------------
Created by: Leonardo Merza
Version 1.0
------------------------------------------------------------*/

/*------------------------------------------------------------
Variables
------------------------------------------------------------*/
int analogPin = 0; // pin for msgeq7 input
int strobePin = 13; // msgeq7 strobe pin for cycling through channels
int resetPin = 1; // reset pin of the msgeq7
int spectrumValue[7]; // array to store the 7 values of the 7 channels
int resetDelay_USec = 100; // delay of resetting MSGEQ7 in microseconds
int strobeDelay_USec = 40; // delay to settle input recording in microseconds
int numberOfChannels = 7; // number of channels in

/*------------------------------------------------------------
Setup Method. Initializes all pins
------------------------------------------------------------*/
void setup() {
  // open usb serial port
  Serial.begin(9600);

  // turn on pins for msgeq7
  pinMode(analogPin, INPUT);
  pinMode(strobePin, OUTPUT);
  pinMode(resetPin, OUTPUT);

  analogReference(DEFAULT);

  // reset msgeq7
  digitalWrite(resetPin, LOW);
  digitalWrite(strobePin, HIGH);
} // void setup()

/*------------------------------------------------------------
Loop method. Resets msgeq7 and captures value of the
7 channels on the msgeq7.
------------------------------------------------------------*/
void loop()
{
  digitalWrite(resetPin, HIGH);
  delayMicroseconds(resetDelay_USec);
  digitalWrite(resetPin, LOW);

  for(int i=0; i<numberOfChannels; i++){
    // start reading channel by changing strobe to low
    digitalWrite(strobePin, LOW);
    // allows input to settle to get accurate reading
    delayMicroseconds(strobeDelay_USec);
    // read value of current pin from msgeq7
    spectrumValue[i] = analogRead(analogPin);
    // print out value to serial monitor
    Serial.print(spectrumValue[i]);
    Serial.print(" ");
    // strobe high then low advances to the next channel
    digitalWrite(strobePin, HIGH);
  } // for(int i=0; i<numberOfChannels; i++)
  Serial.println();
} // void loop()

Lap Timer

2014-05-08 23.16.54

This project was used to time a vehicle around a track, using a sonar sensor to detect when the vehicle passed by. What made the project challenging was wanting to use components already on hand, which meant driving the 4 digit 7-segment display with a 16 channel multiplexer instead of a display driver. This proved challenging code wise and revealed various problems that had to be solved because of such an unconventional way to display the digits. The project needed to detect when the vehicle first passed the sensor to begin timing. Once the timer started, there needed to be a way to make sure the timer was not falsely triggered while the vehicle was still passing the sensor. The timer also had to display the previous lap time long enough for a user to read it while the next lap was being recorded. Lastly, the timer needed to keep track of and display which lap the vehicle was on. For this particular project, only three laps needed to be recorded.

The 4 digit 7-segment clock display used for this project was the 7FR5643AS. The display, shown below in Figure 1, contains 12 pins, as shown in Figure 2. Pins 6, 8, 9, and 12 select each digit and needed to be activated one by one in order to multiplex the LEDs.

NFD-5643_Dimension
Figure 1 – Pin out of 4 segment clock display

NFD-5643AS_Circuit

Figure 2 – Internal circuit of 4 segment clock display

This particular clock display is sink activated, so the pin for each digit needs to be grounded to turn that digit on. NPN transistors (the P2N2222A) with 1KΩ base resistors were used for this. For the multiplexer, the CD74HC4067E 16 channel multiplexer was used since they were readily available; only eight channels were actually needed to drive the clock display. One problem that cropped up was that, since the display was driven by the multiplexer, the channels on the multiplexer had to be changed very fast to show the numbers correctly. Initially, the channels were changed as soon as each LED was turned on, but it was discovered that capacitance in the multiplexer itself was causing all the LEDs to turn on no matter what numeric value was being displayed. After a few tests, a 500 microsecond delay was added between each LED activation to allow the capacitance of the multiplexer to drain before switching to the next channel. Figure 3 shows the pin out of the 16 channel multiplexer. The common port was connected to 5V and eight channels were used for the seven LED segments of each digit plus the colon in the middle. 12KΩ resistors were used for each LED segment.

multi
Figure 3- Pin out of 16 channel multiplexer

The HC-SR04 ultrasonic sensor was used to detect the vehicle completing a lap. These sensors are similar to the PING sensor but require an extra pin. They were used in a previous project and showed centimeter accuracy up to 4 meters. Even with the extra pin, they carry the same accuracy as the PING sensor but are 4-5x cheaper on Amazon. The datasheet shows that a simple pulse measurement can be used to calculate the distance; dividing the return value by 148 converts it to inches. The pin out for the sensor is seen in Figure 4.

sonar pin
Figure 4 – Pin out for the ultrasonic sensor
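
As a stand-alone illustration of the pulse measurement described above, here is a minimal HC-SR04 reading sketch (the trigger and echo pin numbers match the ones used in the full timer code below). The checkIfSonarIsHigh() function in the project code does the same thing, but with a short timeout so the timer loop is not held up.

// Minimal HC-SR04 read: pulse the trigger, time the echo with pulseIn(),
// then divide the microsecond result by 148 to convert to inches.
int trigPin = 6;
int echoPin = 7;

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // send a short trigger pulse
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // pulseIn() returns the echo pulse width in microseconds (0 on timeout)
  long duration = pulseIn(echoPin, HIGH, 30000);
  long inches = duration / 148;

  Serial.println(inches);
  delay(100);
}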

Two LEDs with 1K resistors were used for the lap counter. Only two pins were left on the Arduino, so the three laps were represented in binary form. The total schematic is shown in Figure 5. Looking back, a single 12K resistor on the common port would have let each LED segment be tied directly to the multiplexer, saving on resistors.

timer-circuit
Figure 5 – Full schematic of timer circuit

The code for the timer became fairly complex because the timer had to avoid false triggering while the vehicle was still passing the sensor, and it had to show the lap time long enough for a reader to comfortably read it while still recording the next lap. To handle this, the vehicle was given a five second leeway to pass by the sensor. While those five seconds elapsed, the previous lap time was displayed (all zeros on the first lap); once they passed, the display went back to showing the current lap time. To display each digit using the multiplexer, the current time had to be split into individual digits and each digit compared against an array representing its seven LED segments. Overall, driving the clock display with the multiplexer required a lot more work than a simple display driver would have, but it was quite a learning experience code wise to drive the display manually. A video of the timer project is shown below along with the full code.

/*****************************************************************************************
Created by: Leonardo Merza
Version: 1.05
*****************************************************************************************/

/*****************************************************************************************
Variables
*****************************************************************************************/
// multiplexer pins
int muxS0Pin = 2;
int muxS1Pin = 3;
int muxS2Pin = 4;
int muxS3Pin = 5;
// digit pins
int digitZeroPin = 8;
int digitOnePin = 9;
int digitTwoPin = 10;
int digitThreePin = 11;
// led pins
int LED0 = 12;
int LED1 = 13;

// sonar sensor pins
int trigPin = 6;
int echoPin = 7;

unsigned long currentTime = 0; // current running time
unsigned long lastTime = 0; // last time the sensor was triggered
unsigned long delayTime = 5000; // time to delay stop timer for sensor

// digits to display
unsigned long displayTimeDigit0 = 0;
unsigned long displayTimeDigit1 = 0;
unsigned long displayTimeDigit2 = 0;
unsigned long displayTimeDigit3 = 0;

long distance = 0; // min distance to trigger sensor
long thresholdDistance = 10; // max distance to trigger sensor
boolean didCarPassSensor = false; // initial boolean for start of sensor
boolean didCarHitSensor = false; // boolean if sensor is triggered again after start
int timeToDelay = 5000; // time to delay sensor reading again
int lapCounter = 0; // counter of lap

// to setup multiplexer to show each number - next line shows 4bit number of each segment in led
//0000 bottom left, 1000 bottom, 0100 dot, 1100 bottom right, 0010 middle, 1010 top left, 0110 top right, 1110 top
int digitZero[] = {0,0,0,0, 1,0,0,0, 1,1,0,0, 1,0,1,0, 0,1,1,0, 1,1,1,0, 1,1,1,1, 1,1,1,1};
int digitOne[] = {0,1,1,0, 1,1,0,0, 1,1,1,1, 1,1,1,1, 1,1,1,1, 1,1,1,1, 1,1,1,1, 1,1,1,1};
int digitTwo[] = {0,0,0,0, 1,1,1,1, 1,0,0,0, 0,0,1,0, 0,1,1,0, 1,1,1,0, 1,1,1,1, 1,1,1,1};
int digitThree[] = {1,0,0,0, 1,1,0,0, 0,0,1,0, 1,1,1,0, 0,1,1,0, 0,0,0,1, 1,1,1,1, 1,1,1,1};
int digitFour[] = {1,1,0,0, 0,0,1,0, 1,0,1,0, 0,1,1,0, 1,1,1,1, 1,1,1,1, 1,1,1,1, 1,1,1,1};
int digitFive[] = {1,0,0,0, 1,1,0,0, 0,0,1,0, 1,0,1,0, 1,1,1,0, 1,1,1,1, 1,1,1,1, 1,1,1,1};
int digitSix[] = {0,0,0,0, 1,0,0,0, 1,1,0,0, 0,0,1,0, 1,0,1,0, 1,1,1,0, 1,1,1,1, 1,1,1,1};
int digitSeven[] = {1,1,0,0, 0,1,1,0, 1,1,1,0, 1,1,1,1, 1,1,1,1, 1,1,1,1, 1,1,1,1, 1,1,1,1};
int digitEight[] = {0,0,0,0, 1,0,0,0, 1,1,0,0, 0,0,1,0, 1,0,1,0, 1,1,1,0, 0,1,1,0, 1,1,1,1};
int digitNine[] = {1,0,0,0, 1,1,0,0, 0,0,1,0, 1,0,1,0, 0,1,1,0, 1,1,1,0, 1,1,1,1, 1,1,1,1};
int digitDot[] = {0,1,0,0, 1,1,1,1, 1,1,1,1, 1,1,1,1, 1,1,1,1, 1,1,1,1, 1,1,1,1, 1,1,1,1};

/*****************************************************************************************
Setup function. Activates all pins and serial port
*****************************************************************************************/
void setup(){

 Serial.begin(9600);

 //lap leds
 pinMode(LED0, OUTPUT);
 pinMode(LED1, OUTPUT);

 // sonar sensor
 pinMode(trigPin, OUTPUT);
 pinMode(echoPin, INPUT);

 // make each pin an output pin
 pinMode(muxS0Pin,OUTPUT);
 pinMode(muxS1Pin,OUTPUT);
 pinMode(muxS2Pin,OUTPUT);
 pinMode(muxS3Pin,OUTPUT);

 pinMode(digitZeroPin,OUTPUT);
 pinMode(digitOnePin,OUTPUT);
 pinMode(digitTwoPin,OUTPUT);
 pinMode(digitThreePin,OUTPUT);
} // void setup()

/*****************************************************************************************
Setup function. Activates all pins and serial port
*****************************************************************************************/
void loop(){
 currentTime = millis(); // get current time
 mainFunction(); // call main function

 // for debugging
 //divideTime();
 //writeTime(displayTimeDigit0, displayTimeDigit1, displayTimeDigit2, displayTimeDigit3);
} // void loop()

/*****************************************************************************************
Function senses when the sonar sensor is activated, then displays the lap time when the sensor is triggered.
*****************************************************************************************/
void mainFunction(){

 if(!(didCarHitSensor)){
 if(checkIfSonarIsHigh()){
 didCarHitSensor = true;
 lastTime = millis();
 lapCounter++;
 } // if sonar is high then go to next step
 } // if car hasnt hit sensor then only check sensor

 if(didCarHitSensor && !(didCarPassSensor)){
 currentTime = millis() - lastTime;
 if(currentTime > timeToDelay){
 didCarPassSensor = true;
 } // if time has passed start showing time
 } // if car has pass sensor then wait for certain time to display time

 if(didCarPassSensor){
 currentTime = millis() - lastTime;
 divideTime();
 if(checkIfSonarIsHigh()){
 didCarPassSensor=false;
 lastTime = millis();
 lapCounter++;
 } // if car has passed sensor again then update lap time
 } // if car has passed sensor display lap time

 writeTime(displayTimeDigit0, displayTimeDigit1, displayTimeDigit2, displayTimeDigit3);
 if(didCarHitSensor)
 displayLap();
} // void mainFunction()

/*****************************************************************************************
take current time and split it up for display.
*****************************************************************************************/
void divideTime(){
 displayTimeDigit0 = currentTime/10000;
 displayTimeDigit1 = (currentTime - displayTimeDigit0*10000)/1000;
 displayTimeDigit2 = (currentTime - displayTimeDigit0*10000 - displayTimeDigit1*1000)/100;
 displayTimeDigit3 = (currentTime - displayTimeDigit0*10000 - displayTimeDigit1*1000 - displayTimeDigit2*100)/10;
} // void divideTime()

/*****************************************************************************************
Receives the 4 digits that were split up and activates each digit on the display one by
one in order to upload the array that displays the number.
*****************************************************************************************/
void writeTime(int displayTimeDigit0Temp, int displayTimeDigit1Temp, int displayTimeDigit2Temp, int displayTimeDigit3Temp){

 //digit 0 select pin high - the rest low
 digitalWrite(digitOnePin,LOW);
 digitalWrite(digitTwoPin, LOW);
 digitalWrite(digitThreePin,LOW);
 digitalWrite(digitZeroPin,HIGH);
 getArray(displayTimeDigit0Temp);

 //digit 1 select pin high - the rest low
 digitalWrite(digitZeroPin,LOW);
 digitalWrite(digitTwoPin,LOW);
 digitalWrite(digitThreePin,LOW);
 digitalWrite(digitOnePin,HIGH);
 getArray(displayTimeDigit1Temp);

 getArray(10); // activates colon

 //digit 2 select pin high - the rest low
 digitalWrite(digitOnePin,LOW);
 digitalWrite(digitThreePin,LOW);
 digitalWrite(digitZeroPin,LOW);
 digitalWrite(digitTwoPin,HIGH);
 getArray(displayTimeDigit2Temp);

 //digit 3 select pin high - the rest low
 digitalWrite(digitZeroPin,LOW);
 digitalWrite(digitOnePin,LOW);
 digitalWrite(digitTwoPin,LOW);
 digitalWrite(digitThreePin,HIGH);
 getArray(displayTimeDigit3Temp);
} // void writeTime()

/*****************************************************************************************
Receives the digit to display and passes its array representation down to the multiplexer.
*****************************************************************************************/
void getArray(int numberToUse) {
 if(numberToUse == 0)
 displayNumber(digitZero);
 if(numberToUse == 1)
 displayNumber(digitOne);
 if(numberToUse == 2)
 displayNumber(digitTwo);
 if(numberToUse == 3)
 displayNumber(digitThree);
 if(numberToUse == 4)
 displayNumber(digitFour);
 if(numberToUse == 5)
 displayNumber(digitFive);
 if(numberToUse == 6)
 displayNumber(digitSix);
 if(numberToUse == 7)
 displayNumber(digitSeven);
 if(numberToUse == 8)
 displayNumber(digitEight);
 if(numberToUse == 9)
 displayNumber(digitNine);
 if(numberToUse == 10)
 displayNumber(digitDot);
} // void getArray(int numberToUse)

/*****************************************************************************************
Recieves array representation of number and activates multiplexer pins.
*****************************************************************************************/
void displayNumber(int numberArray[]) {
 for(int i=0;i<32;i++){
 // mux has a capacitance that will carry over - wait to drain cap for next led
 delayMicroseconds(500);
 digitalWrite(muxS0Pin,numberArray[i]);
 i++;
 digitalWrite(muxS1Pin,numberArray[i]);
 i++;
 digitalWrite(muxS2Pin,numberArray[i]);
 i++;
 digitalWrite(muxS3Pin,numberArray[i]);
 } // for(int i=0;i<32;i++)
} // void displayNumber(int numberArray[])

/*****************************************************************************************
Checks if the sonar sensor is triggered and returns true or false.
*****************************************************************************************/
boolean checkIfSonarIsHigh(){
 digitalWrite(trigPin, LOW);
 delayMicroseconds(2);
 digitalWrite(trigPin, HIGH);
 delayMicroseconds(2);
 digitalWrite(trigPin, LOW);
 // Compute distance
 distance = pulseIn(echoPin, HIGH, 1000);
 distance = distance/ 148;
 Serial.println(distance);

 if(distance > 0){
 return true;
 } else{
 return false;
 } // if sensor detects object return true else return false
} // boolean checkIfSonarIsHigh()

/*****************************************************************************************
display led binary representation of current lap.
*****************************************************************************************/
void displayLap(){
 if(lapCounter == 1){digitalWrite(LED0, HIGH); digitalWrite(LED1, LOW);}
 else if(lapCounter==2){digitalWrite(LED1, HIGH); digitalWrite(LED0, LOW);}
 else if(lapCounter==3){digitalWrite(LED0, HIGH); digitalWrite(LED1, HIGH);}
 else{lapCounter=1;}
} // void displayLap()

7×10 LED Matrix Powered by MSGEQ7

This project uses the MSGEQ7 to power a 7 channel LED equalizer. The goal was to use components that were already available. Using 70 LEDs, a 7×10 matrix was created by soldering the anodes in each row together and the cathodes in each column together. The matrix setup can be seen in Figure 1 below.

MSGEQ7 LED Matrix

Figure 1 – Back of LED matrix

Some 16 channel multiplexers (cd74hc4067e) were lying around, so they were used to select one row and one column at the same time, lighting one LED at a time. The common input of each multiplexer was tied to 5V or ground. The MSGEQ7 was used to separate the audio signal into seven channels. Since the average human hearing range is 20Hz-20kHz, the IC splits the signal into bands at 63Hz, 160Hz, 400Hz, 1kHz, 2.5kHz, 6.25kHz and 16kHz. These seven outputs are 5V logic and their voltage depends on the volume of the signal in each band. The MSGEQ7 datasheet lists the components necessary to use the chip. An ATmega328P with the Arduino bootloader and a 16MHz crystal oscillator was used to drive the LED matrix, all powered by a 5V regulator. The finished prototype is shown in the video below.
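
To make the row/column addressing concrete, here is a minimal sketch of lighting one LED through the two multiplexers. The pin numbers and helper names are made up for illustration and are not the project's actual wiring; the full code below does the equivalent with the multiBlack and multiRed pin arrays.

// Address one LED in the 7x10 matrix through two CD74HC4067 multiplexers.
// The row mux routes 5V to one row of anodes; the column mux routes ground
// to one column of cathodes, so only the LED at that intersection lights.
int rowSelect[4] = {2, 3, 4, 5}; // select pins of the anode (row) mux
int colSelect[4] = {6, 7, 8, 9}; // select pins of the cathode (column) mux

void setMux(const int selectPins[4], int channel) {
  // write the channel number onto the four select pins as a 4-bit value
  for (int bit = 0; bit < 4; bit++) {
    digitalWrite(selectPins[bit], (channel >> bit) & 1);
  }
}

void setup() {
  for (int i = 0; i < 4; i++) {
    pinMode(rowSelect[i], OUTPUT);
    pinMode(colSelect[i], OUTPUT);
  }
}

void loop() {
  setMux(rowSelect, 3); // pick row 3
  setMux(colSelect, 5); // pick column 5 - the LED at (3,5) lights up
}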

The Arduino code is here:

/*------------------------------------------------------------
Created by: Leonardo Merza
Version 1.0
------------------------------------------------------------*/

/*------------------------------------------------------------
Variables
------------------------------------------------------------*/
int analogPin = 0; // pin for msgeq7 input
int strobePin = 13; // msgeq7 strobe pin for cycling through channels
int resetPin = 12; // reset pin of the msgeq7
int spectrumValue = 0; // current spectrum value
int highLowDelay = 0; // delay of resetting msgeq7
int strobeDelay_USec = 15; // delay to settle input recording in microseconds
int numberOfChannels = 7; // number of channels in
int divider = 100;
int multiplier = 1;

int multiBlack[4] = {2,3,4,5};
int multiRed[4] = {6,7,8,9};

int muxChannel[9][4]={
    {0,0,0,0}, //channel 0
    {1,0,0,0}, //channel 1
    {0,1,0,0}, //channel 2
    {1,1,0,0}, //channel 3
    {0,0,1,0}, //channel 4
    {1,0,1,0}, //channel 5
    {0,1,1,0}, //channel 6
    {1,1,1,0}, //channel 7
    {0,0,0,1}, //channel 8
  };

/*------------------------------------------------------------
Setup Method.  Initializes all pins
------------------------------------------------------------*/
void setup() {
 // open usb serial port
 Serial.begin(9600);

 // turn on pins for msgeq7
 pinMode(analogPin, INPUT);
 pinMode(strobePin, OUTPUT);
 pinMode(resetPin, OUTPUT);

 for(int i=0;i<8;i++) {
   pinMode(i+2,OUTPUT);
 }

 analogReference(DEFAULT);

 // reset msgeq7
 digitalWrite(resetPin, LOW);
 digitalWrite(strobePin, HIGH);
} // void setup()

/*------------------------------------------------------------
Loop method. Resets msgeq7 and captures value of the
7 channels on the msgeq7.
------------------------------------------------------------*/
void loop()
{
 digitalWrite(resetPin, HIGH);
 delay(highLowDelay);
 digitalWrite(resetPin, LOW);

 for (int i=0; i<numberOfChannels;i++) {
   // start reading channel by changing strobe to low
   digitalWrite(strobePin, LOW);
   // allows input to settle to get accurate reading
   delayMicroseconds(strobeDelay_USec);
   // read value of current pin from msgeq7
   spectrumValue = analogRead(analogPin);

   // print out value to serial monitor
   Serial.print(spectrumValue);
   Serial.print(" ");

   for(int j=0;j<4;j++) {
     digitalWrite(multiBlack[j], muxChannel[i][j]);
   } // for(int j=0;j<4;j++)

   for(int n=1+multiplier;n<10+multiplier;n++) {
     for(int k=0;k<n-multiplier;k++) {
       if(spectrumValue > divider*n) {
         for(int m=0;m<4;m++) {
           digitalWrite(multiRed[m], muxChannel[k][m]);
          } // for(int m=0;m<4;m++)
       } // if(spectrumValue[i] > 30*n)
     } // for(int k=0;k<n;k++)
   } // for(int n=1;n<10;n++)

  // strobe pin high the low to go to next channel on msgeq7
   digitalWrite(strobePin, HIGH);

 } // for (int i = 0; i < numberOfChannels; i++)
 Serial.println();

} // void loop()

Kinect Invisible Drumset

This program uses the RGB Camera and Tracking program to display a tracked user in front of a background. A drum set image is then displayed over the user. When the kinect is tracking the user, the z coordinate of the drum area is relative to the user’s right knee. Certain x/y coordinates are then used to activate sounds that go along with the drum image.

This program has a problem in that when the user sits down, the kinect has trouble tracking them at a high confidence level; this seems to be a SimpleOpenNI limitation. The kick drum is also not very accurate, and a physical button would work better for it. Lastly, the way Processing is set up, only one sound can be played at a time. A way around this would be to have a separate sound for each combination of drums that can be hit, and then a set of if-else statements to check which drums are being hit and play the corresponding combined sound. I don’t have this implemented, as it would require a lot of code rewriting, mainly having each combination of drum hits return a boolean to look through and activate the corresponding sound. A number of files, including the background image, drum image, and sound files, are needed.

The code is long, so I’ve uploaded it to my GitHub repository here.

RGB Camera and Tracking

Untitled

 

This program will track a user, take their outline, and paste it onto any background you want. The setup method is where you add the name of the picture you want to use as the background; this picture needs to be stored in the same folder as the program itself.

/*---------------------------------------------------------------
Created by: Leonardo Merza
Version: 1.0
----------------------------------------------------------------*/

/*---------------------------------------------------------------
Imports
----------------------------------------------------------------*/
import SimpleOpenNI.*;

/*---------------------------------------------------------------
Variables
----------------------------------------------------------------*/
// create kinect object
SimpleOpenNI  kinect;
// boolean if kinect is tracking
boolean tracking = false;
// current userid of tracked user
int userID;
// mapping of users
int[] userMapping;
// background image
PImage backgroundImage;
// image from rgb camera
PImage rgbImage;

/*---------------------------------------------------------------
Setup method. Enables kinect and draw window
----------------------------------------------------------------*/
void setup() {
  // start new kinect object
  kinect = new SimpleOpenNI(this);
  //enable depth camera
  kinect.enableDepth();
  // enable color camera
  kinect.enableRGB();
  // enable tracking
  kinect.enableUser();

  // turn on depth-color alignment
  kinect.alternativeViewPointDepthToImage(); 

  // load the background image
  backgroundImage = loadImage("wef.png");
  // create window width/height of rgb camera
  size(kinect.rgbWidth(),kinect.rgbHeight());
} // void setup()

/*---------------------------------------------------------------
Draw method.
----------------------------------------------------------------*/
void draw() {

  // display the background image first at (0,0)
  image(backgroundImage, 0, 0);
  //update kinect image
  kinect.update();

  // get the Kinect color image
  rgbImage = kinect.rgbImage();
  // prepare the color pixels
  loadPixels();
  // get pixels for the user tracked
  userMapping = kinect.userMap();

  // for the length of the pixels tracked, color them
  // in with the rgb camera
  for (int i =0; i < userMapping.length; i++) {
    // if the pixel is part of the user
    if (userMapping[i] != 0) {
      // set the sketch pixel to the rgb camera pixel
      pixels[i] = rgbImage.pixels[i];
    } // if (userMap[i] != 0)
  } // (int i =0; i < userMap.length; i++)

  // update any changed pixels
  updatePixels();
} // void draw()

/*---------------------------------------------------------------
When a new user is found, print new user detected along with
userID and start pose detection.  Input is userID
----------------------------------------------------------------*/
void onNewUser(SimpleOpenNI curContext, int userId){
  println("New User Detected - userId: " + userId);
  // start tracking of user id
  curContext.startTrackingSkeleton(userId);
} //void onNewUser(SimpleOpenNI curContext, int userId)

/*---------------------------------------------------------------
Print when user is lost. Input is int userId of user lost
----------------------------------------------------------------*/
void onLostUser(SimpleOpenNI curContext, int userId){
  // print user lost and user id
  println("User Lost - userId: " + userId);
} //void onLostUser(SimpleOpenNI curContext, int userId)

Kinect Joint Tracking

Untitled

This program is similar to the basic skeleton tracking but also extracts joint position data. It will display the angle of the four major joints (elbows and knees) in radians, which is helpful for robotic applications.

 

/*---------------------------------------------------------------
Created by: Leonardo Merza
Version: 1.0
----------------------------------------------------------------*/

/*---------------------------------------------------------------
Imports
----------------------------------------------------------------*/
// import kinect library
import SimpleOpenNI.*;

/*---------------------------------------------------------------
Variables
----------------------------------------------------------------*/
// create kinect object
SimpleOpenNI kinect;
// image storage from kinect
PImage kinectDepth;
// int of each user being tracked
int[] userID;
// user colors
color[] userColor = new color[]{ color(255,0,0), color(0,255,0), color(0,0,255),
 color(255,255,0), color(255,0,255), color(0,255,255)};

// postion of head to draw circle
PVector headPosition = new PVector();
// turn headPosition into scalar form
float distanceScalar;
// diameter of head drawn in pixels
float headSize = 200;

// threshold of level of confidence
float confidenceLevel = 0.5;
// the current confidence level that the kinect is tracking
float confidence;
// vector of tracked head for confidence checking
PVector confidenceVector = new PVector();
// vector to scalar ratio
float vectorScalar = 525;
// size of drawn dot on each joint
float dotSize = 30;

// Vector values for all joints
PVector SKEL_HEAD = new PVector();
PVector SKEL_LEFT_SHOULDER = new PVector();
PVector SKEL_LEFT_ELBOW = new PVector();
PVector SKEL_LEFT_HAND = new PVector();
PVector SKEL_RIGHT_SHOULDER = new PVector();
PVector SKEL_RIGHT_ELBOW = new PVector();
PVector SKEL_RIGHT_HAND = new PVector();
PVector SKEL_TORSO = new PVector();
PVector SKEL_LEFT_HIP = new PVector();
PVector SKEL_LEFT_KNEE = new PVector();
PVector SKEL_LEFT_FOOT = new PVector();
PVector SKEL_RIGHT_HIP = new PVector();
PVector SKEL_RIGHT_KNEE = new PVector();
PVector SKEL_RIGHT_FOOT = new PVector();

// z coordinates of each limb
float SKEL_HEADZ;
float SKEL_LEFT_SHOULDERZ;
float SKEL_LEFT_ELBOWZ;
float SKEL_LEFT_HANDZ;
float SKEL_RIGHT_SHOULDERZ;
float SKEL_RIGHT_ELBOWZ;
float SKEL_RIGHT_HANDZ;
float SKEL_TORSOZ;
float SKEL_LEFT_HIPZ;
float SKEL_LEFT_KNEEZ;
float SKEL_LEFT_FOOTZ;
float SKEL_RIGHT_HIPZ;
float SKEL_RIGHT_KNEEZ;
float SKEL_RIGHT_FOOTZ;

// angle variables
float leftShoulderElbowX;
float leftShoulderElbowY;
float leftShoulderElbowZ;
float leftWristElbowX;
float leftWristElbowY;
float leftWristElbowZ;

float rightShoulderElbowX;
float rightShoulderElbowY;
float rightShoulderElbowZ;
float rightWristElbowX;
float rightWristElbowY;
float rightWristElbowZ;

float leftHipKneeX;
float leftHipKneeY;
float leftHipKneeZ;
float leftFootKneeX;
float leftFootKneeY;
float leftFootKneeZ;

float rightHipKneeX;
float rightHipKneeY;
float rightHipKneeZ;
float rightFootKneeX;
float rightFootKneeY;
float rightFootKneeZ;

// actual angles in radians of knees and elbows
float leftElbowAngle;
float rightElbowAngle;
float leftKneeAngle;
float rightKneeAngle;

/*---------------------------------------------------------------
Starts new kinect object and enables skeleton tracking.
Draws window
----------------------------------------------------------------*/
void setup()
{
 // start a new kinect object
 kinect = new SimpleOpenNI(this);

 // enable depth sensor
 kinect.enableDepth();

 // enable skeleton generation for all joints
 kinect.enableUser();

 // draw thickness of drawer
 strokeWeight(3);
 // smooth out drawing
 smooth();

 // create a window the size of the depth information
 size(kinect.depthWidth(), kinect.depthHeight());
} // void setup()

/*---------------------------------------------------------------
Updates Kinect. Gets users tracking and draws skeleton and
head if confidence of tracking is above threshold
----------------------------------------------------------------*/
void draw(){
 // update the camera
 kinect.update();
 // get Kinect data
 kinectDepth = kinect.depthImage();
 // draw depth image at coordinates (0,0)
 image(kinectDepth,0,0);

 // get all user IDs of tracked users
 userID = kinect.getUsers();

 // loop through each user to see if tracking
 for(int i=0;i<userID.length;i++)
 {
 // if Kinect is tracking certain user then get joint vectors
 if(kinect.isTrackingSkeleton(userID[i]))
 {
 // get confidence level that Kinect is tracking head
 confidence = kinect.getJointPositionSkeleton(userID[i],
 SimpleOpenNI.SKEL_HEAD,confidenceVector);

 // if confidence of tracking is beyond threshold, then track user
 if(confidence > confidenceLevel)
 {
 // change draw color based on hand id#
 stroke(userColor[(i)]);
 // fill the ellipse with the same color
 fill(userColor[(i)]);
 // get coordinates of all joints
 getCoordinates(userID[i]);
 // subtract vectors of limbs
 subtractVectors();
 // get angles of joints
 getJointAngles();
 } //if(confidence > confidenceLevel)
 } //if(kinect.isTrackingSkeleton(userID[i]))
 } //for(int i=0;i<userID.length;i++)
} // void draw()

/*---------------------------------------------------------------
When a new user is found, print new user detected along with
userID and start pose detection. Input is userID
----------------------------------------------------------------*/
void onNewUser(SimpleOpenNI curContext, int userId){
 println("New User Detected - userId: " + userId);
 // start tracking of user id
 curContext.startTrackingSkeleton(userId);
} //void onNewUser(SimpleOpenNI curContext, int userId)

/*---------------------------------------------------------------
Print when user is lost. Input is int userId of user lost
----------------------------------------------------------------*/
void onLostUser(SimpleOpenNI curContext, int userId){
 // print user lost and user id
 println("User Lost - userId: " + userId);
} //void onLostUser(SimpleOpenNI curContext, int userId)

/*---------------------------------------------------------------
Called when a user is tracked.
----------------------------------------------------------------*/
void onVisibleUser(SimpleOpenNI curContext, int userId){
} //void onVisibleUser(SimpleOpenNI curContext, int userId)

/*---------------------------------------------------------------
Gets XYZ coordinates of all joints of tracked user and draws
a small circle on each joint
----------------------------------------------------------------*/
void getCoordinates(int userID)
{
 // get postion of all joints
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_HEAD,SKEL_HEAD);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_LEFT_SHOULDER,SKEL_LEFT_SHOULDER);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_LEFT_ELBOW,SKEL_LEFT_ELBOW);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_LEFT_HAND,SKEL_LEFT_HAND);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_RIGHT_SHOULDER,SKEL_RIGHT_SHOULDER);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_RIGHT_ELBOW,SKEL_RIGHT_ELBOW);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_RIGHT_HAND,SKEL_RIGHT_HAND);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_TORSO,SKEL_TORSO);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_LEFT_HIP,SKEL_LEFT_HIP);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_LEFT_KNEE,SKEL_LEFT_KNEE);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_LEFT_FOOT,SKEL_LEFT_FOOT);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_RIGHT_HIP,SKEL_RIGHT_HIP);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_RIGHT_KNEE,SKEL_RIGHT_KNEE);
 kinect.getJointPositionSkeleton(userID,
 SimpleOpenNI.SKEL_RIGHT_FOOT,SKEL_RIGHT_FOOT);

 // convert real world point to projective space
 kinect.convertRealWorldToProjective(SKEL_HEAD,
 SKEL_HEAD);
 kinect.convertRealWorldToProjective(SKEL_LEFT_SHOULDER,
 SKEL_LEFT_SHOULDER);
 kinect.convertRealWorldToProjective(SKEL_LEFT_ELBOW,
 SKEL_LEFT_ELBOW);
 kinect.convertRealWorldToProjective(SKEL_LEFT_HAND,
 SKEL_LEFT_HAND);
 kinect.convertRealWorldToProjective(SKEL_RIGHT_SHOULDER,
 SKEL_RIGHT_SHOULDER);
 kinect.convertRealWorldToProjective(SKEL_RIGHT_ELBOW,
 SKEL_RIGHT_ELBOW);
 kinect.convertRealWorldToProjective(SKEL_RIGHT_HAND,
 SKEL_RIGHT_HAND);
 kinect.convertRealWorldToProjective(SKEL_TORSO,
 SKEL_TORSO);
 kinect.convertRealWorldToProjective(SKEL_LEFT_HIP,
 SKEL_LEFT_HIP);
 kinect.convertRealWorldToProjective(SKEL_LEFT_KNEE,
 SKEL_LEFT_KNEE);
 kinect.convertRealWorldToProjective(SKEL_LEFT_FOOT,
 SKEL_LEFT_FOOT);
 kinect.convertRealWorldToProjective(SKEL_RIGHT_HIP,
 SKEL_RIGHT_HIP);
 kinect.convertRealWorldToProjective(SKEL_RIGHT_KNEE,
 SKEL_RIGHT_KNEE);
 kinect.convertRealWorldToProjective(SKEL_RIGHT_FOOT,
 SKEL_RIGHT_FOOT);

 // scale z vector of each joint to scalar form
 SKEL_HEADZ = (vectorScalar/SKEL_HEAD.z);
 SKEL_LEFT_SHOULDERZ = (vectorScalar/SKEL_LEFT_SHOULDER.z);
 SKEL_LEFT_ELBOWZ = (vectorScalar/SKEL_LEFT_ELBOW.z);
 SKEL_LEFT_HANDZ = (vectorScalar/SKEL_LEFT_HAND.z);
 SKEL_RIGHT_SHOULDERZ = (vectorScalar/SKEL_RIGHT_SHOULDER.z);
 SKEL_RIGHT_ELBOWZ = (vectorScalar/SKEL_RIGHT_ELBOW.z);
 SKEL_RIGHT_HANDZ = (vectorScalar/SKEL_RIGHT_HAND.z);
 SKEL_TORSOZ = (vectorScalar/SKEL_TORSO.z);
 SKEL_LEFT_HIPZ = (vectorScalar/SKEL_LEFT_HIP.z);
 SKEL_LEFT_KNEEZ = (vectorScalar/SKEL_LEFT_KNEE.z);
 SKEL_LEFT_FOOTZ = (vectorScalar/SKEL_LEFT_FOOT.z);
 SKEL_RIGHT_HIPZ = (vectorScalar/SKEL_RIGHT_HIP.z);
 SKEL_RIGHT_KNEEZ = (vectorScalar/SKEL_RIGHT_KNEE.z);
 SKEL_RIGHT_FOOTZ = (vectorScalar/SKEL_RIGHT_FOOT.z);

 // fill the dot color by the user id
 fill(userColor[userID-1]);

 // draw the circle at the position of the joint with the
 // diameter dependent on the z axis
 ellipse(SKEL_HEAD.x,SKEL_HEAD.y,
 SKEL_HEADZ*dotSize,SKEL_HEADZ*dotSize);
 ellipse(SKEL_LEFT_SHOULDER.x,SKEL_LEFT_SHOULDER.y,
 SKEL_LEFT_SHOULDERZ*dotSize,SKEL_LEFT_SHOULDERZ
 *dotSize);
 ellipse(SKEL_LEFT_ELBOW.x,SKEL_LEFT_ELBOW.y,
 SKEL_LEFT_ELBOWZ*dotSize,SKEL_LEFT_ELBOWZ
 *dotSize);
 ellipse(SKEL_LEFT_HAND.x,SKEL_LEFT_HAND.y,
 SKEL_LEFT_HANDZ*dotSize,SKEL_LEFT_HANDZ
 *dotSize);
 ellipse(SKEL_RIGHT_SHOULDER.x,SKEL_RIGHT_SHOULDER.y,
 SKEL_RIGHT_SHOULDERZ*dotSize,SKEL_RIGHT_SHOULDERZ
 *dotSize);
 ellipse(SKEL_RIGHT_ELBOW.x,SKEL_RIGHT_ELBOW.y,
 SKEL_RIGHT_ELBOWZ*dotSize,SKEL_RIGHT_ELBOWZ
 *dotSize);
 ellipse(SKEL_RIGHT_HAND.x,SKEL_RIGHT_HAND.y,
 SKEL_RIGHT_HANDZ*dotSize,SKEL_RIGHT_HANDZ
 *dotSize);
 ellipse(SKEL_TORSO.x,SKEL_TORSO.y,
 SKEL_TORSOZ*dotSize,SKEL_TORSOZ
 *dotSize);
 ellipse(SKEL_LEFT_HIP.x,SKEL_LEFT_HIP.y,
 SKEL_LEFT_HIPZ*dotSize,SKEL_LEFT_HIPZ
 *dotSize);
 ellipse(SKEL_LEFT_KNEE.x,SKEL_LEFT_KNEE.y,
 SKEL_LEFT_KNEEZ*dotSize,SKEL_LEFT_KNEEZ
 *dotSize);
 ellipse(SKEL_LEFT_FOOT.x,SKEL_LEFT_FOOT.y,
 SKEL_LEFT_FOOTZ*dotSize,SKEL_LEFT_FOOTZ
 *dotSize);
 ellipse(SKEL_RIGHT_HIP.x,SKEL_RIGHT_HIP.y,
 SKEL_RIGHT_HIPZ*dotSize,SKEL_RIGHT_HIPZ
 *dotSize);
 ellipse(SKEL_RIGHT_KNEE.x,SKEL_RIGHT_KNEE.y,
 SKEL_RIGHT_KNEEZ*dotSize,SKEL_RIGHT_KNEEZ
 *dotSize);
 ellipse(SKEL_RIGHT_FOOT.x,SKEL_RIGHT_FOOT.y,
 SKEL_RIGHT_FOOTZ*dotSize,SKEL_RIGHT_FOOTZ
 *dotSize);
} // void getCoordinates()
/*---------------------------------------------------------------
Subtracts each vector from each limb combination from each other
----------------------------------------------------------------*/
void subtractVectors() {
 // take vector[] shoulder and subtract from vector[] elbow
 leftShoulderElbowX = SKEL_LEFT_SHOULDER.x - SKEL_LEFT_ELBOW.x;
 leftShoulderElbowY = SKEL_LEFT_SHOULDER.y - SKEL_LEFT_ELBOW.y;
 leftShoulderElbowZ = SKEL_LEFT_SHOULDER.z - SKEL_LEFT_ELBOW.z;
 // take vector[] hand and subtract from vector[] elbow
 leftWristElbowX = SKEL_LEFT_HAND.x - SKEL_LEFT_ELBOW.x;
 leftWristElbowY = SKEL_LEFT_HAND.y - SKEL_LEFT_ELBOW.y;
 leftWristElbowZ = SKEL_LEFT_HAND.z - SKEL_LEFT_ELBOW.z;
 // take vector[] shoulder and subtract from vector[] elbow
 rightShoulderElbowX = SKEL_RIGHT_SHOULDER.x - SKEL_RIGHT_ELBOW.x;
 rightShoulderElbowY = SKEL_RIGHT_SHOULDER.y - SKEL_RIGHT_ELBOW.y;
 rightShoulderElbowZ = SKEL_RIGHT_SHOULDER.z - SKEL_RIGHT_ELBOW.z;
 // take vector[] hand and subtract from vector[] elbow
 rightWristElbowX = SKEL_RIGHT_HAND.x - SKEL_RIGHT_ELBOW.x;
 rightWristElbowY = SKEL_RIGHT_HAND.y - SKEL_RIGHT_ELBOW.y;
 rightWristElbowZ = SKEL_RIGHT_HAND.z - SKEL_RIGHT_ELBOW.z;

 // take vector[] hip and subtract from vector[] knee
 leftHipKneeX = SKEL_LEFT_HIP.x - SKEL_LEFT_KNEE.x;
 leftHipKneeY = SKEL_LEFT_HIP.y - SKEL_LEFT_KNEE.y;
 leftHipKneeZ = SKEL_LEFT_HIP.z - SKEL_LEFT_KNEE.z;
 // take vector[] foot and subtract from vector[] knee
 leftFootKneeX = SKEL_LEFT_FOOT.x - SKEL_LEFT_KNEE.x;
 leftFootKneeY = SKEL_LEFT_FOOT.y - SKEL_LEFT_KNEE.y;
 leftFootKneeZ = SKEL_LEFT_FOOT.z - SKEL_LEFT_KNEE.z;
 // take vector[] hip and subtract from vector[] knee
 rightHipKneeX = SKEL_RIGHT_HIP.x - SKEL_RIGHT_KNEE.x;
 rightHipKneeY = SKEL_RIGHT_HIP.y - SKEL_RIGHT_KNEE.y;
 rightHipKneeZ = SKEL_RIGHT_HIP.z - SKEL_RIGHT_KNEE.z;
 // take vector[] foot and subtract from vector[] knee
 rightFootKneeX = SKEL_RIGHT_FOOT.x - SKEL_RIGHT_KNEE.x;
 rightFootKneeY = SKEL_RIGHT_FOOT.y - SKEL_RIGHT_KNEE.y;
 rightFootKneeZ = SKEL_RIGHT_FOOT.z - SKEL_RIGHT_KNEE.z;
} // void subtractVectors()

/*---------------------------------------------------------------
Gets the angles of the elbow and knee joints and prints them.
Example math:

Let the coordinates of the elbow (x,y,z) be denoted as E,
the shoulder S, and the wrist W.

Define the vectors EW = W - E = (x_w-x_e, y_w-y_e, z_w-z_e)
and ES = S - E.

Then the angle between vector EW and vector ES is given
by arccos((EW dot ES) / (mag(EW) * mag(ES))),

where mag(X) is the magnitude of a vector, given by
sqrt(x^2 + y^2 + z^2), and

A dot B is the dot product of
vectors A and B, given by A_x*B_x + A_y*B_y + A_z*B_z.
----------------------------------------------------------------*/
void getJointAngles() {
 leftElbowAngle = acos((leftShoulderElbowX
 *leftWristElbowX+leftShoulderElbowY
 *leftWristElbowY+leftShoulderElbowZ
 *leftWristElbowZ)/(sqrt(leftWristElbowX
 *leftWristElbowX+leftWristElbowY
 *leftWristElbowY+leftWristElbowZ*leftWristElbowZ)
 *(sqrt(leftShoulderElbowX*leftShoulderElbowX
 +leftShoulderElbowY*leftShoulderElbowY
 +leftShoulderElbowZ*leftShoulderElbowZ))));

 rightElbowAngle = acos((rightShoulderElbowX
 *rightWristElbowX+rightShoulderElbowY
 *rightWristElbowY+rightShoulderElbowZ*
 rightWristElbowZ)/(sqrt(rightWristElbowX
 *rightWristElbowX+rightWristElbowY*
 rightWristElbowY+rightWristElbowZ*rightWristElbowZ)
 *(sqrt(rightShoulderElbowX*rightShoulderElbowX
 +rightShoulderElbowY*rightShoulderElbowY
 +rightShoulderElbowZ*rightShoulderElbowZ))));

 leftKneeAngle = acos((leftHipKneeX
 *leftFootKneeX+leftHipKneeY
 *leftFootKneeY+leftHipKneeZ
 *leftFootKneeZ)/(sqrt(leftFootKneeX
 *leftFootKneeX+leftFootKneeY
 *leftFootKneeY+leftFootKneeZ*leftFootKneeZ)
 *(sqrt(leftHipKneeX*leftHipKneeX
 +leftHipKneeY*leftHipKneeY
 +leftHipKneeZ*leftHipKneeZ))));

 rightKneeAngle = acos((rightHipKneeX
 *rightFootKneeX+rightHipKneeY
 *rightFootKneeY+rightHipKneeZ
 *rightFootKneeZ)/(sqrt(rightFootKneeX
 *rightFootKneeX+rightFootKneeY
 *rightFootKneeY+rightFootKneeZ*rightFootKneeZ)
 *(sqrt(rightHipKneeX*rightHipKneeX
 +rightHipKneeY*rightHipKneeY
 +rightHipKneeZ*rightHipKneeZ))));

 // print angles
 print("Left elbow angle: ");
 println(leftElbowAngle);
 print("Right elbow angle: ");
 println(rightElbowAngle);
 print("Left knee angle: ");
 println(leftKneeAngle);
 print("Right knee angle: ");
 println(rightKneeAngle);
} // void getJointAngles()


Kinect Skeleton Tracking

Untitled

This program uses the depth sensor to track multiple users and display a skeleton structure over each tracked user. The “confidence level” is a number between 0 and 1 that the kinect outputs to show how confident it is in tracking that user; I’ve set it to require at least 0.5 before drawing the skeleton. The program selects a draw color based on the user id and draws the skeleton over the user being tracked. On previous versions of SimpleOpenNI, you had to use the “psi” pose to calibrate the tracked user. This is no longer needed and users are tracked as soon as they enter the kinect’s field of vision.

/*---------------------------------------------------------------
Created by: Leonardo Merza
Version: 1.0

This class will track skeletons of users and draw them
----------------------------------------------------------------*/

/*---------------------------------------------------------------
Imports
----------------------------------------------------------------*/
// import kinect library
import SimpleOpenNI.*;

/*---------------------------------------------------------------
Variables
----------------------------------------------------------------*/
// create kinect object
SimpleOpenNI  kinect;
// image storage from kinect
PImage kinectDepth;
// int of each user being  tracked
int[] userID;
// user colors
color[] userColor = new color[]{ color(255,0,0), color(0,255,0), color(0,0,255),
                                 color(255,255,0), color(255,0,255), color(0,255,255)};

// postion of head to draw circle
PVector headPosition = new PVector();
// turn headPosition into scalar form
float distanceScalar;
// diameter of head drawn in pixels
float headSize = 200;

// threshold of level of confidence
float confidenceLevel = 0.5;
// the current confidence level that the kinect is tracking
float confidence;
// vector of tracked head for confidence checking
PVector confidenceVector = new PVector();

/*---------------------------------------------------------------
Starts new kinect object and enables skeleton tracking.
Draws window
----------------------------------------------------------------*/
void setup()
{
  // start a new kinect object
  kinect = new SimpleOpenNI(this);

  // enable depth sensor
  kinect.enableDepth();

  // enable skeleton generation for all joints
  kinect.enableUser();

  // draw thickness of drawer
  strokeWeight(3);
  // smooth out drawing
  smooth();

  // create a window the size of the depth information
  size(kinect.depthWidth(), kinect.depthHeight());
} // void setup()

/*---------------------------------------------------------------
Updates Kinect. Gets users tracking and draws skeleton and
head if confidence of tracking is above threshold
----------------------------------------------------------------*/
void draw(){
  // update the camera
  kinect.update();
  // get Kinect data
  kinectDepth = kinect.depthImage();
  // draw depth image at coordinates (0,0)
  image(kinectDepth,0,0); 

   // get all user IDs of tracked users
  userID = kinect.getUsers();

  // loop through each user to see if tracking
  for(int i=0;i<userID.length;i++)
  {
    // if Kinect is tracking certain user then get joint vectors
    if(kinect.isTrackingSkeleton(userID[i]))
    {
      // get confidence level that Kinect is tracking head
      confidence = kinect.getJointPositionSkeleton(userID[i],
                          SimpleOpenNI.SKEL_HEAD,confidenceVector);

      // if confidence of tracking is beyond threshold, then track user
      if(confidence > confidenceLevel)
      {
        // change draw color based on hand id#
        stroke(userColor[(i)]);
        // fill the ellipse with the same color
        fill(userColor[(i)]);
        // draw the rest of the body
        drawSkeleton(userID[i]);

      } //if(confidence > confidenceLevel)
    } //if(kinect.isTrackingSkeleton(userID[i]))
  } //for(int i=0;i<userID.length;i++)
} // void draw()

/*---------------------------------------------------------------
Draw the skeleton of a tracked user.  Input is userID
----------------------------------------------------------------*/
void drawSkeleton(int userId){
   // get 3D position of head
  kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_HEAD,headPosition);
  // convert real world point to projective space
  kinect.convertRealWorldToProjective(headPosition,headPosition);
  // create a distance scalar related to the depth in z dimension
  distanceScalar = (525/headPosition.z);
  // draw the circle at the position of the head with the head size scaled by the distance scalar
  ellipse(headPosition.x,headPosition.y, distanceScalar*headSize,distanceScalar*headSize);

  //draw limb from head to neck
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_HEAD, SimpleOpenNI.SKEL_NECK);
  //draw limb from neck to left shoulder
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_LEFT_SHOULDER);
  //draw limb from left shoulder to left elbow
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_LEFT_ELBOW);
  //draw limb from left elbow to left hand
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_ELBOW, SimpleOpenNI.SKEL_LEFT_HAND);
  //draw limb from neck to right shoulder
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
  //draw limb from right shoulder to right elbow
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_RIGHT_ELBOW);
  //draw limb from right elbow to right hand
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_ELBOW, SimpleOpenNI.SKEL_RIGHT_HAND);
 //draw limb from left shoulder to torso
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
  //draw limb from right shoulder to torso
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
  //draw limb from torso to left hip
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_LEFT_HIP);
  //draw limb from left hip to left knee
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_HIP,  SimpleOpenNI.SKEL_LEFT_KNEE);
  //draw limb from left knee to left foot
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_KNEE, SimpleOpenNI.SKEL_LEFT_FOOT);
  //draw limb from torso to right hip
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_RIGHT_HIP);
  //draw limb from right hip to right knee
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_HIP, SimpleOpenNI.SKEL_RIGHT_KNEE);
  //draw limb from right knee to right foot
  kinect.drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_KNEE, SimpleOpenNI.SKEL_RIGHT_FOOT);
} // void drawSkeleton(int userId)

/*---------------------------------------------------------------
When a new user is found, print new user detected along with
userID and start pose detection.  Input is userID
----------------------------------------------------------------*/
void onNewUser(SimpleOpenNI curContext, int userId){
  println("New User Detected - userId: " + userId);
  // start tracking of user id
  curContext.startTrackingSkeleton(userId);
} //void onNewUser(SimpleOpenNI curContext, int userId)

/*---------------------------------------------------------------
Print when user is lost. Input is int userId of user lost
----------------------------------------------------------------*/
void onLostUser(SimpleOpenNI curContext, int userId){
  // print user lost and user id
  println("User Lost - userId: " + userId);
} //void onLostUser(SimpleOpenNI curContext, int userId)

/*---------------------------------------------------------------
Called when a user is tracked.
----------------------------------------------------------------*/
void onVisibleUser(SimpleOpenNI curContext, int userId){
} //void onVisibleUser(SimpleOpenNI curContext, int userId)

User Center of Mass Tracking

This program uses the depth sensor to track the center of gravity of tracked users. It will draw a circle at each user’s center of gravity and print each tracked user’s center of gravity vector position.

/*---------------------------------------------------------------
Created by: Leonardo Merza
Version: 1.0
----------------------------------------------------------------*/

/*---------------------------------------------------------------
Imports
----------------------------------------------------------------*/
// import kinect object
import SimpleOpenNI.*;

/*---------------------------------------------------------------
Variables
----------------------------------------------------------------*/
// create kinect object
SimpleOpenNI kinect;
// current position of current user
PVector position = new PVector();
// current kinect image storage
PImage kinectImage;
// create new user list
IntVector userList;
// size of dot to show center of mass position
int dotSize = 10;
// user colors
color[] userColor = new color[]{ color(255,0,0), color(0,255,0), color(0,0,255),
                                 color(255,255,0), color(255,0,255), color(0,255,255)};
/*---------------------------------------------------------------
Setup method. enables kinect and tracking. Creates draw window
----------------------------------------------------------------*/
void setup() {
  // create new kinect object
  kinect = new SimpleOpenNI(this);
  // enable depth sensor
  kinect.enableDepth();
  // enable rgb sensor
  kinect.enableRGB();
  // enable tracking but with no joint tracking
  kinect.enableUser();
  // start new user list
  userList = new IntVector();
  // create window size of depth info
  size(kinect.rgbWidth(),kinect.rgbHeight());
} // void setup()

/*---------------------------------------------------------------
Draw method. Updates kinect and displays it. Draws red dots at
tracked user's center of gravity
----------------------------------------------------------------*/
void draw() {
  // update kinect
  kinect.update();
  // get current image from kinect
  kinectImage = kinect.rgbImage();
  //draw current image at (0,0)
  image(kinectImage, 0, 0);

  //get list of tracked users
  kinect.getUsers(userList);

  //for each tracked user find center of gravity
  for (int i=0; i<userList.size(); i++) {
    // get current userID
    int userId = userList.get(i);

    // get center of mass of user (CoM) if available
    if(kinect.getCoM(userId, position)){
      // convert coordinates to projective space
      kinect.convertRealWorldToProjective(position, position);

      //print out current hand's id# and position
      println("userId: " + userId + ", position: " + position);

      // change draw color based on hand id#
      stroke(userColor[(i)]);
      // fill the ellipse with the same color
      fill(userColor[(i)]);
      // draw ellipse at x/y coordinates given ellipse size
      ellipse(position.x,position.y,dotSize,dotSize);

    }//if(kinect.getCoM(userId, position))
  } // for (int i=0; i<userList.size(); i++)
} // void draw()
