Stabilizing a two-wheeled vehicle.

For a control systems project course we built a self-stabilizing two-wheeled robot inspired by the Segway platform.

The device.


The two-wheeled device was built earlier by the control systems group and was equipped with a motor, tachometer, gearbox, frame and wheels. We added an accelerometer, the necessary hardware and an Arduino to control it.
From the start there were problems with the gears. They were worn, and the axle allowed for some movement, which caused the gears to lock up frequently. Disassembling the gearbox and reassembling it so the gears could move more freely fixed this. Another problem was that the bearings through which the main wheel axle is mounted are not secured to the frame and fall out easily when a lateral force is applied to the wheels.

A schematic overview

Arduino connections overview

The tachometer output a −4 V to +4 V signal. A voltage summer was built to shift this into the 0–5 V range the Arduino can read. A low-pass filter was added to the accelerometer output, as high-frequency vibrations were not wanted in the signal.

Voltage summer
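The two conditioning steps can be sketched in software terms. This is a minimal illustration, not the actual analog circuit or firmware: the summer maps −4…+4 V linearly onto 0…5 V, and the filter is a generic first-order low-pass whose `alpha` constant is an assumption.

```python
def summer(v_tacho: float) -> float:
    """Shift and scale a -4..+4 V tachometer signal into the 0..5 V ADC range."""
    return (v_tacho + 4.0) * 5.0 / 8.0

class LowPass:
    """Discrete first-order low-pass filter: y += alpha * (x - y).

    The alpha value here is illustrative; the real circuit was an analog RC filter.
    """
    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha
        self.y = 0.0

    def update(self, x: float) -> float:
        self.y += self.alpha * (x - self.y)
        return self.y
```

A smaller `alpha` corresponds to a lower cut-off frequency, i.e. stronger suppression of the vibration components.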


We decided to use a PD controller for our inverted pendulum, for two reasons. The first is that our system already contains two integrators, in the form of the measurement/calculation of the absolute angle from the angular acceleration. The second is simply that it gave the best results. In our final code we used a Kp value of 1 and a Kd value of 0.2, obtained through testing.
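The control law above can be sketched as follows. This is an illustration of the PD structure with the Kp = 1 and Kd = 0.2 values we used, not the original Arduino code; the loop period `dt` and the angle source are assumptions.

```python
KP = 1.0   # proportional gain from our final code
KD = 0.2   # derivative gain from our final code

class PDController:
    """Basic PD controller: u = Kp * e + Kd * de/dt."""
    def __init__(self, kp: float, kd: float, dt: float):
        self.kp, self.kd, self.dt = kp, kd, dt
        self.prev_error = 0.0

    def update(self, setpoint: float, measured: float) -> float:
        error = setpoint - measured
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative
```

Each loop iteration the filtered angle would be fed in as `measured` with a setpoint of 0 (upright), and the output used to drive the motor.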


In the end the inverted pendulum was neither a complete failure nor a complete success. Due to a lack of time and the underestimated complexity of the task we did not manage to stabilize the inverted pendulum as planned, but we did keep it upright for at least a few seconds. The main reason was the amount of delay in the system, caused mostly by friction in the gears, slipping of the wheels and Arduino code that did not manage to run at 1 kHz. Many other factors influenced the system as well, such as high-frequency vibrations in the accelerometer, environmental noise in the form of wind and air flow, conversion problems in the ADC and the worn-out gears.
Concluding, we can say the project was an interesting experience that made us realize how different a practical stabilization process is from a theoretical one.
To stabilize the inverted pendulum fully, further users of our installation would have to improve the code or use an Arduino with more processing power, and the gears should be fixed to reduce friction to a minimum.

Stimulating litter removal in community rooms through interactive trash cans

Schematic of experiment setup

The course Cyber Crime Science had a final project where we had to identify a local crime and find a technical solution for it. We chose to tackle the crime of littering in the student association room.

To encourage throwing away trash we devised a system where the occupants of the room were given a choice between two trash cans, a red one and a blue one. A nearby screen informed the occupants of the topic currently being voted on. This topic was always chosen to be popular and controversial, such as ‘PC vs Mac’ or ‘Cola vs Pepsi’.
The trash cans were equipped with a distance sensor in the opening to detect pieces of trash, and the results were shown on the screen. We hypothesized that giving people a means to express their opinion would make the trash cans more actively used.
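The counting logic can be sketched like this. It assumes a distance sensor per can that reads a shorter distance while an item passes the opening; the threshold value and the edge-detection scheme are illustrative, not measured from the actual installation.

```python
THRESHOLD_CM = 20.0  # assumed: a reading below this means something is in the opening

class TrashVoteCounter:
    """Count one vote per item dropped into the red or blue can."""
    def __init__(self):
        self.votes = {"red": 0, "blue": 0}
        self._blocked = {"red": False, "blue": False}

    def update(self, can: str, distance_cm: float) -> None:
        blocked = distance_cm < THRESHOLD_CM
        # Count a vote only on the transition from open to blocked,
        # so one item falling through registers as a single vote.
        if blocked and not self._blocked[can]:
            self.votes[can] += 1
        self._blocked[can] = blocked
```

The edge detection keeps a slowly falling item from being counted more than once.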

The experiment involved asking groups of two people to solve a puzzle while they were handed a drink and a plastic-wrapped snack. After they solved the puzzle we asked them to wait until they had finished their drinks and snacks. They were then notified of the trash can system by a sound bite. Their disposal behavior was recorded and afterwards a questionnaire was given.

We observed different littering behavior and more trash can usage in the intervention group that was exposed to the trash can system than in the control group that was not.

More detailed information is available in the presentation slides.

Puff and Sip: a virtual glassblowing hardware controller.

I was commissioned by Robert Wendrich of Raw Shaping Technology to create a tangible user interface for glassblowing-like interaction with the computer. This hybrid interface has great potential, as human mouth output has very high fidelity and everyone is able to modulate it very precisely.

Device overview, showing generated shape in Blender, visualizer and the device itself.

The device is based on an Arduino, which gathers data from several sensors and sends it wirelessly to the PC. The sensors chosen to give the user a glassblowing-like experience are a pressure sensor and an air flow sensor; both are ranged for human mouth output and give excellent readings. The device is also equipped with an orientation sensor that sends the orientation angles of the device to the PC, which can be used to simulate a virtual glass blowpipe.
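On the PC side the incoming sensor data has to be parsed before it can be visualized. The sketch below assumes a simple comma-separated line format, `pressure,flow,pitch,roll,yaw`; the actual wireless protocol is an implementation detail not documented here.

```python
from typing import NamedTuple

class Sample(NamedTuple):
    """One sensor packet from the device (assumed field order)."""
    pressure: float
    flow: float
    pitch: float
    roll: float
    yaw: float

def parse_line(line: str) -> Sample:
    """Parse an assumed 'pressure,flow,pitch,roll,yaw' line into a Sample."""
    pressure, flow, pitch, roll, yaw = (float(f) for f in line.strip().split(","))
    return Sample(pressure, flow, pitch, roll, yaw)
```

Each parsed sample would then feed the graph, sphere and tube displays described below.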

The scope of this project covered a hardware prototype and a software visualization. The visualization only shows the data; it is not a full simulation of molten glass, as that would be beyond the scope of the project.

Visualization software.

The software is divided into five sections displaying the data in different ways. The visualization is written in the Processing language, a language made for graphical programming. The first display shows the data from the pressure and flow sensors in a graph. The second shows a sphere sized by a cumulative value based on the sensor values, which gives a sense of volume. The third simply shows the latest numbers received, handy for debugging. The large display below shows a tube that is continually drawn using the latest data; its diameter is set by the pressure, flow or cumulative values. This visualization serves as a raw design tool and shows that the data from the device is interesting and accurate enough to create shapes. The shapes can be recorded and exported as a mesh data file, which can be opened in the 3D suite Blender with a script to further iterate the design. The final visualization is a bar representing the orientation of the device.
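The sphere and tube displays both derive from the same idea: integrating the flow over time gives a volume-like cumulative value, and a chosen signal sets the tube radius at each step. This sketch shows that idea only; the sample period and scale factor are assumptions, and the real visualization is written in Processing.

```python
def cumulative_volume(flow_samples, dt: float = 0.02) -> float:
    """Integrate flow readings over time (rectangle rule) to get a
    volume-like cumulative value. dt is an assumed sample period."""
    return sum(flow_samples) * dt

def tube_radii(samples, scale: float = 0.5):
    """Map a chosen signal (pressure, flow or cumulative value) to a
    radius per drawn tube segment. The scale factor is illustrative."""
    return [scale * s for s in samples]
```

Blowing harder or longer raises the cumulative value, which is why the sphere grows with sustained breath.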

Below is a video presentation of the device, which also goes into detail on the internals.