I worked with Carme on Tuesday and Thursday. She definitely understands that she needs to touch the screen, nay, that she needs to touch a square, for a reward to dispense. We don't know yet whether she fully understands that she needs to touch the red square specifically. We will continue to train her so that she understands that she needs to touch the red square for a food reward. We are also training her that her initial touch on the screen counts as her discrimination task "choice". Actually, let me back up a little. Carme usually stations in front of SMARTA, puts one of her hands on SMARTA for support, and then uses the other to make her discrimination task "choice". As you can imagine, this poses a problem when the supporting hand is on the screen; she would immediately get a trial correct or incorrect because her hand happens to be where the stimulus appears. We wrapped up with her on Thursday and are quite confident that it will take a few more sessions before she finally understands her task.
Carme using her hand to support herself on the SMARTA when doing discrimination tasks.
I only got to work with Pyxis on Wednesday. This was her first session with SMARTA, so we reinforced her whenever she came close to SMARTA to investigate or looked at the screen. We alternated between a blank screen and a fully red screen to elicit her attention. We do this with all the lemurs we train because we want them to learn that this black box has a screen that will display stimuli. How easily the lemurs process this information is debatable, but one thing is for sure: well-timed positive reinforcement to guide them through a task is imperative.
Oh, and I have a new part-time research assistant, and my full-time research assistant will be here in March. Happy days!