Sunday, 11 December 2011

Heating Mat DIY (part 1)

Appliances for science are overpriced. Massively. One of those things is the heating mat, which doesn't do much other than take the temperature, compare it to the set point and regulate the heat output accordingly. This might sound complicated, but controller units doing just that have been around for a long time and hence are not very expensive. Certainly not as expensive as companies try to make us believe. Labrigger has a post on DIY heating pads made from industrial parts, built by Taro Ishikawa, which I found very interesting and ingenious. And so I embarked on a little adventure and made one myself.

Things did turn out to be a little tricky at times, especially due to my rather rusty knowledge of circuits. Luckily, a friend from a collaborating group was happy to explain even the most basic principles of electric circuits with great patience and assisted me as I put everything together. I bought all my parts from RS Components, in many ways the British equivalent of the McMaster-Carr mentioned in the original post.

First off, the idea: what we want is a system that keeps the mouse at a certain temperature, say 38°C. In more technical terms, we have a set point of 38°C and a measured variable (i.e. body temperature) that we can manipulate so it stays at the set point. We manipulate it by changing the temperature of the heating mat. Physiologists will probably think of a simple negative feedback mechanism here, and that's exactly what it is. This could be done manually: just get a thermometer and a heating mat with adjustable heat output and regulate the heat up or down depending on the body temperature. In fact, many companies sell exactly that kind of system for hundreds of pounds (if not thousands). A better system has a controller that regulates heat output automatically, so we can concentrate on whatever we are doing (surgery, for example). The latter is the kind of system I've been working on.

The principal components of the system are:

PID Controller (Control Unit)
In industry, control units that keep a variable at a certain set point have been around for a long time, the most abundant of which are PID controllers. I bought one with a linear DC output (0-10V) as opposed to most standard models, which come with relays only (RS 701-2788). This gives me the necessary fine control over the heating mat.
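To make the idea concrete, here is a minimal Python sketch of what such a controller does internally. The gains and the two I/O functions are made-up placeholders for illustration, not what the RS unit actually runs:

import time

# Hypothetical I/O stubs - replace with real sensor/actuator access.
def read_temperature():
    return 36.5  # placeholder reading in degrees C

def set_heater_output(volts):
    print("heater drive: %.2f V" % volts)

KP, KI, KD = 2.0, 0.1, 0.5  # made-up gains; the real ones come from tuning
SETPOINT = 38.0             # degrees C
DT = 0.5                    # control interval in seconds

integral, prev_error = 0.0, 0.0
while True:
    error = SETPOINT - read_temperature()
    integral += error * DT                  # I: accumulated past error
    derivative = (error - prev_error) / DT  # D: how fast the error changes
    drive = KP * error + KI * integral + KD * derivative
    set_heater_output(max(0.0, min(10.0, drive)))  # clamp to the 0-10V output
    prev_error = error
    time.sleep(DT)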

What a beauty. Here you can see it with the mains supply connected.
Heating Mat (actuator)
The heating mat is a rather simple thing: it turns electricity into heat. Wow.
Also bought from RS (245-528): 50x100mm with a maximum output of 5W at 12V (DC, of course). Maximum temperature: 200°C - much more than we need.
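A quick back-of-the-envelope from those specs: at full output the mat draws I = P / V = 5W / 12V ≈ 0.42A, which corresponds to an element resistance of roughly R = V² / P ≈ 29Ω. Useful to know when sizing the power supply.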
  
Thermocouple (sensor)
We use a thermocouple as our temperature sensor. The working principle is that a junction of two dissimilar conductors generates a small voltage that depends on its temperature (the Seebeck effect) - for the common type K, roughly 41µV per °C. All we really need to know is that there are different types and that we need one our PID controller can read (ours can read most types).

In the pictures below you can see a K-type thermocouple, but a T-type probe suitable for mice has been ordered and will simply swap in.

Non-Inverting Amplifier
The output of our PID controller is 0-10V maximum, but we want 12V. I realise I could probably get away with just using the 0-10V range, but I don't like operating electrical components at their output limits. Therefore I put together a simple non-inverting amplifier. The central component is an operational amplifier (RS 652-5678):

The wiring diagram for the opamp.
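For reference, the gain of a non-inverting amplifier is set by the two resistors in the feedback network:

gain = 1 + R_f / R_g

To stretch the controller's 10V maximum to the 12V I want, I need a gain of 1.2; for example, R_f = 2kΩ and R_g = 10kΩ would do (illustrative values, not necessarily the ones on my board).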



Building Notes
Here is a quick documentation of the building process:

First, connecting the mains. Care has been taken not to expose the contacts too much.
It's alive!
The connector on the bottom left is the thermocouple. The two green wires are + and - of the linear output.

In red is the current temperature reading; below it, the default set point of -128.8°C (irrelevant at this stage). Mind you, the reading comes from a workbench heated up by the sun. This is Scotland in December. It's not 23.6°C. Anywhere.


Just to double-check, I connected the PID controller to my PC via an NI DAQ and read the linear output. What you see are slow ramps as I increase and decrease the set point. Clearly, this signal needs some cleaning up, so I put bypass capacitors and an RC low-pass filter between the PID and the opamp. Yes, I took a photo of my screen instead of taking a screenshot. Lazy logic.
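For the record, the cutoff frequency of a simple RC low-pass filter is f_c = 1 / (2πRC). With, say, R = 1kΩ and C = 1µF (illustrative values, not necessarily what ended up in my circuit) that gives f_c ≈ 159Hz, still far more bandwidth than a temperature signal will ever need.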
Here is an overview of the circuit. It's really just a basic non-inverting amplifier with the aforementioned components. The black heatsink in the middle keeps the opamp from overheating.
Here you can see the thermocouple taped to the heating mat on the right, and the PID controller on the left.
I also used an oscilloscope (left) to monitor either the output of the PID controller or that of the amplifier, which helped me keep an eye on voltages in the system and find mistakes.

Next Steps
The current system keeps the heating mat at a certain temperature, which is already quite handy. I will go on to tune this system so I can use it with a thermocouple that takes the mouse's body temperature directly and regulates the heating mat accordingly. I had no idea how much science has gone into tuning algorithms for PID controllers (there are actual patents on some; one classic recipe is sketched below), but with a lot of care this should be doable. Bear in mind that we won't use this heating mat around other electrically sensitive equipment, so no precautions to shield mains noise have been taken (yet). Also, I will solder all this onto a stripboard and box it up so it looks nice and is safe when used with animals. Expect an update after Christmas.
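To give a taste of the tuning literature: the classic Ziegler-Nichols recipe is to increase the proportional gain until the system oscillates steadily at some ultimate gain K_u with oscillation period T_u, and then set K_p = 0.6 K_u, T_i = T_u / 2 and T_d = T_u / 8. Whether this particular recipe is the one I end up using remains to be seen, but it illustrates the general approach.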

Tuesday, 22 November 2011

More Aperture Science

We do what we must because we can. And so I finally constructed the long-promised aperture that blocks unwanted light and shadows from reaching the screen. Up to now I had a brightly lit rectangle with a round shadow on the screen. While it was clear to me that this would be gone in the final version, it did confuse many people I presented my work to. Below is a picture of that effect.

Click on the picture to see it full size. To make it clearer I drew a red rectangle around the overshooting light.

The initial plan was to calculate the exact shape needed to let through only the light that will also hit the mirror (see this post for details). However, since the light at the level of the aperture is still unfocused, it was difficult to calculate exactly how big the opening had to be to catch all of the overshooting light. Therefore, I decided in favour of a Thorlabs aperture with a maximum opening of 75mm diameter; this way I can close it as far as necessary. It is mounted on the poles that also hold the flat mirror. At this location it is as close to the mirrors/screen as possible (with respect to the path of light) without interfering with the VR projection.


Thorlabs aperture fully opened

The aperture is just the right size. Had it been too small, I could have mounted it a bit closer to the projector, but there the picture is even more unfocused, which means I would also have blocked more of the light meant for the screen and thus lost contrast/brightness.

The reason we can't just tape some cardboard to the back of the round mirror to catch the excess light is that the light deflected by the convex mirror (better known as the Angular Amplification Mirror or AAM, see previous posts for more information) goes right past it.

Circled in red: how tightly the light from the AAM shoots past the flat mirror. Adding cardboard here to catch excess light would inevitably block some of the virtual reality as well.
As you can see in the picture of the aperture, its frame isn't big enough to block all of the light, so I repurposed an old diary of a colleague (I hope she'll never read this) and turned its hardback cover into extra bits of aperture. In the end it actually looks a lot more professional than it really is.

What a nice day it was indeed.


A masterpiece of handcrafting. My primary school teacher would be proud.
Top bit mounted.
The bottom bit is just two corners of the same material stuck to the aperture with double-sided tape. As you can see in the bottom left corner, no light is overshooting the round mirror now.

With that done, the virtual reality looks a lot better and less confusing. It is starting to look pretty good now, and I keep wondering whether there is a way to upscale this for human use. Not for science, that is.

Wednesday, 19 October 2011

More pictures on Motion Tracking

I didn't show any close-up pictures of the new holders in the previous post, so here are a few:




 Any questions? Leave a comment or e-mail me.

Monday, 17 October 2011

Apples and Oranges that look like Apples (Motion Tracking 2)

If you haven't done so already, read this post first; the following one will make more sense that way.

One of the problems with my initial solution was that the holder for the optical mouse was too big and collided with the treadmill base when I tried to position it. To circumvent this problem I designed a new optical mouse holder that would have a smaller spatial profile. Here is the result:

Computer mouse holder with reduced spatial profile.
When it arrived, it fit very well around the optical mouse. Mounting it on my setup worked very well too, and it allowed me to place the mouse where I needed it. In fact, it was so nice I wanted to order another one straight away, without waiting for the second Dell mouse to arrive. Being overly careful as I am, I decided to wait for it, and that was a good decision...

When the new mice arrived I was in for a surprise. I opened one up and looked at what's inside. The exterior is, for all intents and purposes, the same; the inside, however, isn't. First warning sign: the USB signature revealed that the new mice were made by a different manufacturer. Turning the new mouse on its back showed a slightly different sticker informing you about the technical details nobody really cares about.

Dell mice. The one on the left is the one I gutted first; on the right is a similar-looking mouse with a different interior.
The sticker on its belly is the only visible difference from the outside.

The interior is significantly different in shape. The good thing, at least, is that the sensor and lens are the same.
Much worse than the different shape was the fact that the new mouse would not cooperate with the Python code I had developed over about two weeks. If you've ever tried to read directly from the USB bus, you'll know what a massive pain it is to get the operating system to cooperate with you (more on that in a future post on the software side of the motion tracking). I tried desperately to find the exact same mouse in one of the other offices and exchange it, but, as it turns out, the one I needed was only delivered within a very small time window and is now phased out. Sigh. In any case, this unexpected non-compliance of reality with my expectations cost me about a week: I had to find a new way to read from two mice independently and re-model the mouse holder.
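To give a flavour of what reading directly from the USB bus looks like, here is a stripped-down sketch using PyUSB. The vendor/product IDs and the report layout are assumptions; as I learned the hard way, every mouse is different:

import usb.core
import usb.util

# Hypothetical IDs - check the output of lsusb for your particular mouse.
dev = usb.core.find(idVendor=0x046d, idProduct=0xc077)
if dev is None:
    raise ValueError("mouse not found")

# Steal the device from the kernel's HID driver to get the raw reports.
if dev.is_kernel_driver_active(0):
    dev.detach_kernel_driver(0)
dev.set_configuration()

# Grab the first IN endpoint of the first interface.
cfg = dev.get_active_configuration()
intf = cfg[(0, 0)]
endpoint = usb.util.find_descriptor(
    intf,
    custom_match=lambda e: usb.util.endpoint_direction(
        e.bEndpointAddress) == usb.util.ENDPOINT_IN)

def to_signed(byte):
    return byte - 256 if byte > 127 else byte

while True:
    report = endpoint.read(endpoint.wMaxPacketSize)
    # Report layout is an assumption: byte 1 = dx, byte 2 = dy on this mouse.
    dx, dy = to_signed(report[1]), to_signed(report[2])
    print(dx, dy)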

The mouse holder for the new mouse. Small differences, but the re-design nevertheless cost some time.

With all that sorted out, I was finally able to mount the second mouse on the setup and test it. Everything worked fine. I have yet to test the spherical treadmill together with the virtual reality, because I prefer to keep the computer running the virtual reality in the office until development is finalised. Otherwise I'd have to sit in the isolated lab for the rest of the development.

Two mice mounted around the treadmill.
The only thing still missing now is the reward system. I've been waiting for that order for almost two months now, and if that company weren't the only one I could find that makes those particular valves, I would have cancelled the order weeks ago. With a bit of luck there will be a post on that soon. Also, I will post about the software side of the motion tracking system in the not-so-distant future.

Thursday, 15 September 2011

If you don't like what you see, buy a new mirror

Household mirrors are designed to withstand the usual attacks by toddlers/angry teenagers/flying objects and to that end are back-surface coated. This means you have a sheet of glass, and on the back of that sheet is the reflective surface. That way the reflective coating is protected by the glass, which can be conveniently wiped down. This feature, however, makes them unsuitable for deflecting the picture of a projector.

At the point where the projector output hits the mirror, the picture is still unfocused. Before and after the unfocused picture is reflected by the reflective surface, it is refracted by the glass. The effect is that the picture no longer focuses correctly on the screen, also known as 'ghosting' (see picture below). I didn't do the raytracing of how exactly this happens because I feel exploring that issue in such depth would be a waste of time.

Ghosting effect when using a household mirror. Above and below the actual object are shadows, which result from the unfocused picture not being deflected uniformly. Walls and everything else are equally affected, but there it's less obvious.

To resolve this issue, we needed a front-surface coated mirror. These are far more delicate, as they scratch easily. After some research I ordered one from Knightoptical with an enhanced aluminium coating (just as a reminder: it's 140mm in diameter). This is, as far as I know, the cheapest coating that reflects the entire spectrum of visible light.


Front surface coated mirror. It even comes with a protective film!
Like the previous mirror, this one is mounted on an L-shaped piece of scrap metal. Just make sure to fasten the nut tightly so the mirror doesn't slip.

It is thicker (6mm) than the household mirror I had before but, to my surprise, not much heavier. That was a relief: I had worried the weight could become a problem if the right-angle joints holding the mirror were not strong enough.


Here is the result: no more ghosting. You still see a slight glow or blur around the edges in the picture, but that's only the camera struggling with light shining into the lens. You can also still see light overshooting the mirror, but this will be resolved soon too.

On a side note, I've replaced the Thorlabs 90-degree angle joints with Newport joints because they can be fastened much tighter and easily withstand knocks and vibration. The Thorlabs joints I had to re-tighten every so often because the screw that presses against the post to hold it comes loose easily. Generally, I choose the brand depending on who I feel has the better solution for a given part.


Thorlabs (top) and Newport (bottom) angle joints. As you can see, the Thorlabs joint only has a screw pressing against the post to fasten it, whereas the Newport joint actually clamps onto the post. Both pictures are property of the respective companies.

Results
With the new mirror installed, the ghosting effect is completely gone. Further, because the formerly scattered light is now focused where it should be, contrast and brightness have improved noticeably. I still think one or two more things are needed to improve picture quality, but this is definitely a step in the right direction.

Thursday, 1 September 2011

Computer Mouse Surgery (Motion Tracking)

Having a mouse run around the virtual reality requires tracking the movement of the mouse (i.e. of the treadmill) and updating the location in the virtual reality accordingly. The cheapest motion-tracking system uses optical computer mice, held stationary at the equator of the spherical treadmill. Apart from being cheap, computer mice are also very easy to interface with the computer. Hölscher et al. (2005) have successfully used this system, as have Harvey et al. (2009), and so will I.

One of the technical difficulties is that the lens has to be very close to the surface of the treadmill, and even small changes in distance can cause problems in tracking motion. Recently, mice with a laser instead of an ordinary LED have become available; they are more precise and also seem to tolerate changes in the distance between sensor and surface better. After a few tests I decided to use one of the standard-issue Dell mice:

The mouse used for motion tracking. Very cost-efficient and shows good performance. Dell calls it, almost affectionately, 'Dell Laser Scroll USB (6 buttons scroll) Black Mouse'.

One potential downside of this unit is an inbuilt function that changes cursor speed. This might cause problems later on because I need a 1:1 mapping between mouse movement and input to the computer. I will explain this further in a later post.

First of all I had to remove the housing of the computer mouse and take out the parts that matter, which is essentially just one PCB. I would like to show you pictures of the gutting process, but alas, I've done this to so many computer mice before this one that I stopped documenting it. My desk is a computer mouse graveyard. Soon enough I will need a second one of these though, at which point I will upload a post with pictures of every step.

After taking out the PCB I also discovered that this mouse has a very handy construction feature: the lens for the laser and camera is mounted directly on the chip (unlike in most other mice, where it is attached to the housing). So I can just take the PCB out and have a fully functional mouse:

The underside of the computer mouse PCB. In the centre you can see the lens.
To allow the mouse to track the motion of the treadmill, we need to bring the bottom surface of the lens as close as possible to the surface of the ball without touching it. To do that we need something to hold the PCB. I guess it's redundant to say that the combination of 3D-modelling software and 3D printing came in very handy here.

A 3D model of the mouse holder. The pole at the bottom is 1/2" in diameter so I can mount it like any normal post on the airtable. The other features are easier to explain when you see the finished thing.



The PCB slots in like this, exposing the lens quite well. I only had to make minor adjustments because apparently my measurements were not 100% correct.
The PCB is fixed like this: a screw that just happens to have exactly the right length presses down on the board at a point where there are no components. I realise the screw can come loose quite easily; should that become a problem, I will just put some Loctite on the thread.
Because 3D printing is neither precise enough nor strong enough to make a thread for a screw, I left a slot for a nut. To my surprise it fitted in absolutely beautifully, no need for glue or anything. The cutout on the top right makes space for the USB cable.

You can see a small rectangular cutout in the picture above; this is where the connector for the USB cable is located. Simply plug in the cable and then connect it to the computer as you would a normal mouse (see picture below).

Results

After putting everything together there wasn't much to do other than mount it on the airtable and see how it works. The mounting angle turned out to be a problem. The base of the treadmill didn't leave enough space to mount the mouse straight at the equator, which meant I had to angle the computer mouse a bit and mount it further up; that in turn required mounting it vertically rather than horizontally, as I had planned initially. This takes up more space for the posts, which is suboptimal because there isn't much room to the left (micromanipulator) or right (head restraint). The next generation of the 3D model will resolve this problem; at less than £100 per 3D print, one or two prototypes are within budget.

A first test showed that movement along the horizontal axis (i.e. running forward and backward) is picked up very well; movement across it, however, is tracked rather poorly. This means I will need a second mouse mounted at 90 degrees to the current one to pick up sideways movement, as was done in the two setups before mine (Hölscher et al. 2005 and Harvey et al. 2009).
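For the curious, here is a rough sketch of how the two readings could be combined once the second mouse is mounted. The geometry and the counts-per-mm figure are assumptions; the real calibration is still to be done:

# Two optical mice at the equator of the ball, 90 degrees apart (assumed
# geometry): mouse A sees forward/backward rotation on its y channel,
# mouse B sees sideways rotation on its y channel, and the ball spinning
# on the vertical axis shows up on both x channels.
def ball_motion(a_dx, a_dy, b_dx, b_dy, counts_per_mm=40.0):
    forward = a_dy / counts_per_mm               # running forward/backward
    sideways = b_dy / counts_per_mm              # stepping left/right
    yaw = (a_dx + b_dx) / (2.0 * counts_per_mm)  # turning on the spot
    return forward, sideways, yaw

# Example: 80 counts on mouse A's y channel -> 2mm forward
print(ball_motion(0, 80, 0, 0))  # (2.0, 0.0, 0.0)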

 
This is how it is mounted right now. Not ideal, but this will be resolved in the next generation of the 3D model of the holder. The clearance is as small as possible, but small movements of the ball have to be taken into account.

Update
Here is the link to the follow up post: http://mousevr.blogspot.com/2011/10/apples-and-oranges-that-look-like.html

References

1. Harvey CD, Collman F, Dombeck DA, Tank DW. Intracellular dynamics of hippocampal place cells during virtual navigation. Nature. 2009;461(7266):941-6. Available at: http://www.ncbi.nlm.nih.gov/pubmed/19829374 [Accessed September 20, 2010].

2. Hölscher C, Schnee A, Dahmen H, Setia L, Mallot HA. Rats are able to navigate in virtual environments. The Journal of Experimental Biology. 2005;208(Pt 3):561-9. Available at: http://www.ncbi.nlm.nih.gov/pubmed/15671344 [Accessed June 15, 2011].

Saturday, 6 August 2011

Virtual Reality Virtually Complete

Many hours were spent finding the correct image transformation, but it seems like I've finally made some significant progress. A fair amount of fine tuning is still required, but the transformation and setup seem to be sound.

Virtual Reality in all its glory.
To create and run the virtual reality I use Blender 3D (open-source software, free for anyone), which also allows me to warp the output. The image transformation function we use was contributed to Blender 3D by Dalai Felinto, based on work by Paul Bourke, both of whom can't go unmentioned here. Dalai helped me directly with advice on image transformation and Blender 3D. Further, Hans-Jürgen Dahmen (not mentioned for the first time on this blog) kindly provided me with information on the image transformation used in his setup (Hölscher et al. 2005).

The Setup
As described previously on this blog, the virtual reality has three components: projector, mirror system and screen. The dimensions and specifications of all of these components are crucial for finding the right transformation. But even once all the calculations were done, it wasn't easy to arrange all the parts exactly. Mounting the projector in just the right place isn't easy, and neither is getting the mirrors at the correct angle and distance relative to each other.

Projector mount. Quite wobbly, but works if nobody touches it... or exhales too close to it.
The projector has to be positioned at the point of origin of the ray (top left, where the yellow lines meet)

Projector and screen. On the right you can see the pressurised air line taking an adventurous route across the room before it feeds into treadmill and airtable.
We are using a Vision Techmount ceiling bracket with a 500mm extension pole. It does the job, but a fair amount of play means I will have to go back and stabilise it; otherwise it stays too wobbly. That shouldn't be too difficult though.

Screen and mirror construction are described in detail here. 

First tests have shown that the material the screen is made of (canvas paper) doesn't have an ideal surface texture, which I think decreases sharpness/contrast. Further, you can see in the picture above that the borders between the paper strips are quite visible, which might present confounding visual stimuli to the mouse. I am looking into ways of mitigating this effect.

The mirrors had been aligned prior to mounting the projector. However, minor adjustments to the flat mirror could be made to correct for slight misalignment of the projector without introducing a noticeable error in the virtual reality.
 
Image Transformation
This part is what gave me a headache for a while, not because it's so difficult but because I'm very incompetent when it comes to geometry. Paul Bourke's website (link) contains a very comprehensive description of his virtual reality system (which is different from ours, but the explanations are nevertheless very helpful). More than that, the image transformation functions for his system are available in Blender 3D and can be adapted to work for our system.

The goal is to get a 360° view in a ring shape: a ring because the centre of the projector output hits the apex of the convex mirror, and concentric rings around the centre translate into horizontal lines on the screen. I hope that makes sense; see the pictures below if it doesn't.

To achieve this, the 'spherical panoramic' function of Blender 3D and an input-output warp mesh are the crucial pieces for my application. The spherical panoramic creates a 360° view from 6 virtual cameras and stitches the images together (similar to a stitched panorama). In a second transformation, an input-output mapping is applied to the picture. In other words, two meshgrids are created: meshgrid A is placed over the spherical panorama and meshgrid B has the shape I need. Each node in grid A has a corresponding node in grid B, and the picture is warped into the shape of grid B. I hope this will make more sense after the illustrations below:

This is an overview of the maze. The cube is where the observer is positioned, facing the red wall.
This is the normal perspective view as it initially appears on the screen. Never mind the monkey face.
First transformation: the spherical panoramic. This is an inbuilt function of Blender 3D, which makes things a lot easier. If you don't know how this picture is created, try this: stretch your arms out forward and touch the middle fingertip of your left hand to that of your right. Now make a ring with your arms by bending your elbows out to the sides while your fingertips still touch. Imagine an observer standing in the middle of that ring, looking at your chest and seeing only what's in front of him/her. Now, in a final step, part your fingertips and stretch your arms out to either side. The observer can now see 360° in front of him/her (your hands correspond to the green wall in this case).

After this comes the crucial step: to translate the spherical panoramic into a ring-shape we use two meshes, one is laid over the spherical panoramic and the other has the desired shape.

The square grid is laid over the spherical panoramic. Each node has a corresponding node in the grid below. I've reduced the number of nodes to 12x12 for illustration; the actual warp meshes used have 48x48 nodes.
The ring-shaped mesh. The bottom-left node of the square grid corresponds to the bottom node of the innermost ring of this grid. From there it goes around counter-clockwise.
And this is the result. The picture is upside down which has got to do with the fact that the projector is mounted upside down.

The maths behind this is reasonably simple and can be found on Paul Bourke's website (link). First, the square meshgrid is constructed. In our example above it is 12x12; note, though, that it is actually 12x13: in the circular meshgrid we have to come full circle, so the first and the last point on each ring are the same.

For the circular meshgrid one can simply take the square coordinates and calculate cosine and sine respectively (more information here):

transmatx(i,j) = y_radmap * cos(theta);
transmaty(i,j) = y_radmap * sin(theta);

(you can find the full listing on the "gridscript listing", found in the blue bar running across the top of the page)
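For anyone who wants to play with the grids outside of Blender 3D, here is a small Python/NumPy reconstruction of the same construction (the variable names and the two radii are my own choices, not the values from the gridscript):

import numpy as np

nrings, ncols = 48, 49       # 48 segments around, 49 columns so the ring closes
r_inner, r_outer = 0.2, 1.0  # inner and outer radius of the ring (assumed values)

theta = np.linspace(0.0, 2.0 * np.pi, ncols)      # one angle per grid column;
                                                  # first and last column coincide
y_radmap = np.linspace(r_inner, r_outer, nrings)  # one radius per grid row

# Each node (i, j) of the square grid gets a position on the ring:
transmatx = y_radmap[:, None] * np.cos(theta[None, :])
transmaty = y_radmap[:, None] * np.sin(theta[None, :])

print(transmatx.shape)  # (48, 49)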


There is more to say about how the transformation is implemented in Blender 3D, but I don't feel like writing an essay about it here. If anyone has questions, feel free to e-mail me (address is at the top right of this website).

Final Words
Image transformation and virtual reality construction are complete now; what's left to do is a fair amount of fine-tuning. How much fine-tuning will depend not only on my subjective sense of what is sufficiently good but also on first tests with animals. After the amount of work and research that has gone into this system, I'm happy to see it finally working.

Here are a few impressions of the glowing ball:





References:
Hölscher, C., Schnee, A., Dahmen, H., Setia, L., & Mallot, H. A. (2005). Rats are able to navigate in virtual environments. Journal of Experimental Biology, 208(Pt 3), 561-569. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/15671344