Negative reinforcement: aiming to increase the likelihood of a behaviour occurring again by removing an unpleasant stimulus.

Positive reinforcement: aiming to increase the likelihood of a behaviour occurring again through the addition of a pleasant stimulus.

Fixed ratio schedule: a schedule of reinforcement whereby a reinforcer is given after a set number of responses, and that number never changes.

So it's "fixed", as in fixed into place, or unchanging, and it's a ratio, right? So for every 5 presses of a button, the mouse gets 1 pellet. 5:1 is the ratio.
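If it helps, here's that 5:1 mouse-and-pellet example as a tiny Python sketch (the function name and numbers are just my illustration):

```python
# Fixed ratio 5: one pellet for every 5 presses, every single time.
def fixed_ratio_5(presses):
    pellets = 0
    count = 0
    for _ in range(presses):
        count += 1
        if count == 5:   # the required number of responses never changes
            pellets += 1
            count = 0
    return pellets

print(fixed_ratio_5(20))  # 20 presses -> 4 pellets
```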

Fixed interval: also unchanging, but the reinforcement is given after a certain amount of time elapses. So for every hour someone works, they get paid $20. The hourly rate doesn't change, but there's a reinforcer given based upon time.
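Same idea in code, with the reinforcer keyed to time instead of responses (again, the name and the $20/hour figure are just the example above):

```python
# Fixed interval: $20 for every full hour worked; the rate never changes.
def pay_for(hours_worked):
    RATE_PER_INTERVAL = 20   # the reinforcer
    INTERVAL_HOURS = 1       # the fixed interval
    return (hours_worked // INTERVAL_HOURS) * RATE_PER_INTERVAL

print(pay_for(8))  # 8 hours -> $160
```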

Variable ratio: a ratio of reinforcement that is unpredictable and changing. It normally has an average number of responses IIRC. So, you might reinforce the mouse after only one button press, which would be a 1:1 ratio, but then wait another 99 button presses before the next reinforcement (99:1 ratio). Ultimately the average is 50:1 (one reinforcer per 50 responses ON AVERAGE). This sets it apart from a random ratio, which has no average.
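You can see the "unpredictable per trial, but fixed on average" part by simulating it (my own sketch: drawing each required press count uniformly from 1–99, which averages out to 50):

```python
import random

# Variable ratio with an average of 50 responses per reinforcer:
# each time, the NEXT required number of presses is drawn at random,
# but the long-run mean requirement is 50.
def simulate_vr50(reinforcers, seed=0):
    rng = random.Random(seed)
    total_presses = 0
    for _ in range(reinforcers):
        total_presses += rng.randint(1, 99)  # mean of this draw is 50
    return total_presses / reinforcers       # average presses per reinforcer

print(simulate_vr50(100_000))  # hovers right around 50
```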

Variable interval: unpredictable amount of time between reinforcers. Think of fishing. You might get a fish after 10 minutes, maybe after an hour, maybe after 2 hours. The fish is the reinforcer here, and it comes at a variable time.
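And the fishing version, where it's the *wait* that's random rather than the number of responses (the 10–120 minute range is just my made-up example):

```python
import random

# Variable interval: the wait before the next fish (the reinforcer)
# is unpredictable, here anywhere from 10 to 120 minutes.
def minutes_until_fish(rng):
    return rng.uniform(10, 120)

rng = random.Random(1)
waits = [minutes_until_fish(rng) for _ in range(3)]
print([round(w) for w in waits])  # three different, unpredictable waits
```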

Notice how I've grouped them.

It goes--

Unchanging:

Ratio

Time

Changing:

Ratio

Time

(FYI continuous reinforcement is reinforcing the behaviour every time it happens)

Continuous reinforcement is best for establishing a behaviour, but not good for maintaining it.

From here on, I'm not 100% sure, so I'll leave it to someone else rather than fuck you up for your exam.