Looking for your input.
12/04/2009 11:44
mahoon
Re: Looking for your input.

A short video of the hand being controlled by the glove:

http://www.youtube.com/watch?v=vhf_scKoFwU


04.12.09
Can now control the pitch of the on-screen hand using the glove. Undecided whether to limit the amount of pitch up and down – don't really want the hand bending up and over backwards, but think natural hand limitations may make this unlikely. Will test to see whether it's needed. Cannot make anything meaningful from the az axis. I'm reading the right bytes, but don't really see too much change in the values, no matter how much I move the glove around. Needs further investigation.
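 
For reference, a minimal sketch of how the pitch could be decoded and clamped in Dark Basic – the byte indices (19 and 20), the centre offset of 512, the ±90 limits and the hand object number are all hypothetical, but the two-byte decode follows the same pattern as the finger values later in this log:

`decode a 10-bit pitch value: low byte plus bottom two bits of the next byte shifted up 8 places
pitch = bytes(19) + ((bytes(20)&&0x03)<<8) - 512

`clamp so the hand can't bend up and over backwards
if pitch > 90 then pitch = 90
if pitch < -90 then pitch = -90

`hand is the object number of the on-screen hand
xrotate object hand, pitch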

02/04/2010 14:38
mahoon
Re: Looking for your input.

Added a video of the glove being able to read which fingers can flex or extend.

http://www.youtube.com/watch?v=A4zghzCdxok

Have also updated the project log:
Progress has been slow for the last couple of months as I've had to concentrate on other assignments with shorter deadlines. The current state of play is that the glove can now read which fingers can flex and extend – if a finger can flex, a variable is set to 1, and if it can also extend, another variable is set to 1. The plan now is to collate those variables to see which fingers can flex/extend and which can't, then use the working fingers' variables to compensate for the ones that can't. For example, if the thumb can flex/extend but the little finger can't extend, assign the thumb's extend variable to the little finger, so the little finger mimics the thumb's movement. Once this is all working as intended, I will devise a method of saving/loading a person's profile so they only have to calibrate the glove once.
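 
As a rough sketch of the idea (the variable names here are illustrative guesses, not the project's actual ones):

`1 = the digit can extend, 0 = it can't
if tbext=1 and fofext=0
   `little finger can't extend, so give it the thumb's extend variable
   fofextval = tbextval
endif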

02/04/2010 15:14
jimh
Re: Looking for your input.

I could at least watch the virtual hand on the screen and pretend my fingers were straight again.

02/05/2010 15:43
mahoon
Re: Looking for your input.

Have the compensation system working (not fully comprehensive, but it compensates for little fingers being unable to extend...) and yes! I can move my virtual hand around and pretend that I, too, have hands that work properly once again.

02/06/2010 00:12
jimh
Re: Looking for your input.

The January issue of National Geographic has an artificial hand on the cover, and an article about "bionics". Lots of great stuff happening, but only in labs of course. They can now give a person some useful control of an electromechanical hand or arm by connecting remaining nerves from higher up in the arm or shoulder, and letting the brain re-map itself to the new control pathways.

02/10/2010 11:48
mahoon
Re: Looking for your input.

Updated project log:
5.02.10
Have devised a system whereby a user can save and load profiles. When a new user uses the system, they are prompted for a profile name. This is stored as a string until the saveprofile() function is called (after 20 seconds of flexing/extending whatever fingers the user can), and each finger's ability to flex/extend is saved as a variable under that profile name. The values are all written out at once, and the user simply chooses the name of their profile when prompted at the start of their next session. Encountered a few problems involving skipping the compensation system if a user already had a profile, but solved this by adding variables to the compensation functions. These are also saved in the user's profile, so the program knows which functions to run if a profile has already been saved:
code example:
if saveprof=0 and compensated=0
   movehand()
endif
Which simply means: if the saveprof value is 0, a profile has not been saved for the user, and if the compensated value is 0, the user has not used the compensation system, so run the plain movehand() function.
Then, after some time, the decrement routines are run, and once the user has used the compensation system and a profile has been saved, the code to acknowledge this is run:

if saveprof=1 and compensated=1
   movecomphand()
endif
Which simply says: if the saveprof value is 1, the user's profile has been saved, and if the compensated value is also 1, they've used the compensation system, so the compensated-hand function movecomphand() should be run.
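 
For what it's worth, a minimal sketch of how the save/load could look with Dark Basic's file commands – the file name, the flag list and the function bodies here are my assumptions, not the project's actual code (and the flags would need to be declared GLOBAL for this to work across functions):

`assumes saveprof, compensated and the per-digit flags are GLOBAL
function saveprofile(profname$)
   if file exist(profname$+".dat")=1 then delete file profname$+".dat"
   open to write 1, profname$+".dat"
   write byte 1, saveprof
   write byte 1, compensated
   write byte 1, tbflex
   write byte 1, tbext
   `...and so on for the other digits' flex/extend flags
   close file 1
endfunction

function loadprofile(profname$)
   open to read 1, profname$+".dat"
   read byte 1, saveprof
   read byte 1, compensated
   read byte 1, tbflex
   read byte 1, tbext
   `...and so on for the other digits' flex/extend flags
   close file 1
endfunction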

Encountered problems displaying text – messages like 'Now using Thumb as the master' would not display. Solved – they were displaying, but only once. This was because I was setting a variable to 1 if a finger/thumb was able to flex/extend and that finger/thumb was selected to be the master, and I exited the function once the user had selected the master, so it was breaking out of the function as soon as the variable was set to 1 and didn't leave enough time to display the message. Have solved this by using a timer system – the user has 10 seconds to flex/extend a finger/thumb to be used as the master, and the message saying which one they've selected (at the moment it's the first one that bends that's chosen as master) is displayed for a few seconds before exiting the function.
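 
A rough sketch of that timed selection, assuming Dark Basic's timer() – the flex threshold of 200 and the per-digit checks are placeholders, not the real values:

`give the user 10 seconds to flex/extend the digit they want as master
start = timer()
master$ = ""
while timer() - start < 10000 and master$ = ""
   `re-read the glove bytes here each pass, then test each digit
   if f1 > 200
      master$ = "Thumb"
      tbmim = 1
   endif
   if f2 > 200 and master$ = ""
      master$ = "First finger"
      ffmim = 1
   endif
   sync
endwhile
if master$ <> ""
   print "Now using " + master$ + " as the master"
   sync
   `leave the message on screen for a few seconds before exiting
   wait 3000
endif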

02/10/2010 11:51
mahoon
Re: Looking for your input.

Updated Project Log:
10.02.10
Can now let the user choose which finger/thumb to use as the 'master' to map any non-working fingers/thumb to. Encountered several problems implementing this, as the wrong fingers were being mapped to the master. Solved this by ensuring a variable was set to 1 for the first finger/thumb to extend/flex, which forces all the other fingers/thumb to have their value set to 0. This means that only one digit will have a value of 1 and all the others have a value of 0. The program now simply checks which finger/thumb has a value of 1 and, as long as all the others are set to 0, adjusts the values read from the glove so the master's bytes are used in place of the non-working digit's. Here's a code example from the movecomphand() function:

f1 = bytes(9)  + ((bytes(10)&&0x03)<<8) - 120
f2 = bytes(11) + ((bytes(12)&&0x03)<<8) - 150
f3 = bytes(13) + ((bytes(14)&&0x03)<<8) - 120
f4 = bytes(15) + ((bytes(16)&&0x03)<<8) - 150
f5 = bytes(17) + ((bytes(18)&&0x03)<<8) - 120

`map first finger to thumb
if tbmim=1 and (ffpOK=0 or ffext=0) and ffmim=0 and sfmim=0 and tfmim=0 and fofmim=0
   f2=f1
endif
This means: f1 is the thumb's value – the low byte (byte 9) plus the bottom two bits of byte 10 shifted up eight places, giving a 10-bit reading from 0 to 1023, minus a rest offset of 120 – while f2 is the first finger's value built the same way from bytes 11 and 12, minus an offset of 150. `map first finger to thumb is just a comment to remind me what this bit of code does, while
if tbmim=1 and (ffpOK=0 or ffext=0) and ffmim=0 and sfmim=0 and tfmim=0 and fofmim=0
means: if the thumb has been selected as the master, the value of tbmim will be 1. If tbmim is 1, and the first finger can't flex (ffpOK=0) or can't extend (ffext=0), and as long as the 'master' values of the first, second, third and fourth fingers are all 0, make f2 the same as f1 – so f2 is read from bytes 9 and 10 instead of 11 and 12. This effectively makes the first finger mimic the actions of the thumb, as the first finger now uses the thumb's bytes to get its values. This is then repeated across all fingers, for whichever master the user selects.
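 
So, assuming the same naming pattern holds for the other digits (the sf* names below are my guesses at the analogous variables), the second finger's version of the mapping would look like:

`map second finger to thumb
if tbmim=1 and (sfpOK=0 or sfext=0) and ffmim=0 and sfmim=0 and tfmim=0 and fofmim=0
   f3=f1
endif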

Short video of the functioning hand in action:
http://www.youtube.com/watch?v=-Ax0MjiS7Dk

02/10/2010 12:09
mahoon
Re: Looking for your input.

And to jimh:
Yes, there's some exciting stuff happening in labs. Give them another 20 years and we might even see the fruits of their labours...

03/18/2010 15:02
mahoon
Re: Looking for your input.

3.3.10
Have created a simple 3D environment with a DirectX object (a chair) in it. Am now considering using a crosshair that the user moves around the room to highlight objects to interact with. I'm hoping to reuse the raycasting routines from a well-known Dark Basic plugin called Sparky's collision.
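 
For reference, the usual shape of a raycast with Sparky's looks something like the sketch below – I'm writing the parameter order from memory, so treat it as an assumption and check the plugin's documentation (cx#, cy#, cz# stand in for the crosshair's world position, and the highlight step is left as a comment):

`register the chair as a collidable object in group 1
SC_SetupObject chair, 1, 0

`cast a ray from the camera out through the crosshair's world position
hit = SC_RayCast(1, camera position x(), camera position y(), camera position z(), cx#, cy#, cz#, 0)
if hit > 0
   `the crosshair is over object number 'hit' - highlight it here
endif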

18.03.10
Managed to integrate the hand with the 3D environment and have the raycasting function from Sparky's implemented. New video of the system at:
http://www.youtube.com/watch?v=YxT7VCXNJ_M

Note: I wasn't wearing the glove at the time of the recording (the glove makes it hard to type code on the keyboard), so there's not too much control going on – the idea was just to demonstrate the principle. Will upload a better video when there are more objects to interact with in the 3D world.

Have approached a chap who's written his own application that controls lights and power plugs using the X10 system (a specially adapted plug is inserted into the regular mains power supply, into which you plug your light, kettle, microwave, whatever; a module is then plugged into the wall and attached to your computer through the COM port or USB port). Messages can then be sent from your computer to the module, and then around the mains electricity supply to the socket that has the adapted plug, which then turns on the appliance attached to it. Hoping this guy can point me towards some simple COMMS software that the X10 system uses.

03/19/2010 17:11
mahoon
Re: Looking for your input.

19.03.10
Have made a simple object (a cube) and added it to the environment. Might have to abandon the X10 idea, as I can't find software that wasn't written in .NET. The guy I mentioned above – Scott Hather – turned out to be very helpful and encouraging, but no joy as yet: although I found a conversion utility to convert .NET DLLs to ones usable by Dark Basic (it involves putting a wrapper around the functions in the DLL), the technicalities are beyond me at the moment. Instead I've decided to simply try to interact with objects in the environment to start with. Can control the crosshairs using the glove, and can now pick up the box when the crosshair is on it and the hand 'grips'. However, it doesn't seem to stay gripped for very long, as I can't move it very far. Not quite sure why that should be happening.
Found that if I update the position of the object that's being grabbed, it grips better but zooms towards the user. I think this is because the picked-up object puts its center at the crosshair position; the crosshair position then moves nearer the user, because it's on the outside of the grabbed object, and the object moves again when it's updated to reflect the new crosshair position – meaning the object rushes towards the user, because the raycast runs from the user to any object in the environment. At least, I think that's what was happening. Whatever it was, it was unsatisfactory.

Have instead made a very small sphere object that is now the root of the crosshair object, which is now just a limb attached to the small sphere. This means I can pick up an object by 'glueing' it to the crosshair limb – its position will then be almost the same as the crosshair's, and therefore it won't rush towards the user when it's gripped. It seems to work, in that I can now pick up and move objects around.
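 
A sketch of that root-sphere-and-limb setup (object and limb numbers are arbitrary, and 'crosshair', 'gripped' and 'hit' stand in for the real variables):

`a tiny sphere acts as the root object
make object sphere 10, 0.1

`turn the crosshair object into a mesh and attach it as limb 1 of the sphere
make mesh from object 1, crosshair
delete object crosshair
add limb 10, 1, 1

`when the hand grips with the crosshair over an object, glue it to the limb
if gripped=1 and hit>0
   glue object to limb hit, 10, 1
endif

`release it again when the grip opens
if gripped=0 and hit>0
   unglue object hit
endif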

New youtube video at:
http://www.youtube.com/watch?v=ahiZKh8h8cM
