I was thinking along the lines of robots mimicking human hands, legs, etc., and/or the ability to learn from human movements and replicate them: emotion, facial expressions, hand gestures taught to the robot, and so on.
Some of these are quite challenging. Brian Bagnall's book (Maximum LEGO NXT: Building Robots with Java Brains) has a chapter on Hands and Exoskeletons that uses a data glove to capture gestures. You could certainly use that approach for hand gestures.
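To give a feel for the glove idea, here is a minimal sketch of turning one frame of glove readings into a gesture label. The five-finger layout, the 0-100 bend range, and the thresholds are my assumptions, not details from Bagnall's book; on a real NXT the values would come from the glove's sensors.

```java
// Hypothetical sketch: classify one frame of data-glove readings.
// Assumption: one reading per finger, 0 = fully extended, 100 = fully bent,
// with the index finger at element 1.
public class GloveGesture {
    public static String classify(int[] fingers) {
        int bent = 0;
        for (int f : fingers) {
            if (f > 60) bent++;                 // treat >60 as a bent finger
        }
        if (bent == fingers.length) return "FIST";
        if (bent == 0) return "OPEN_HAND";
        // Only the index finger extended: pointing.
        if (fingers[1] < 40 && bent == fingers.length - 1) return "POINT";
        return "UNKNOWN";
    }

    public static void main(String[] args) {
        System.out.println(classify(new int[] {90, 85, 95, 88, 92})); // FIST
        System.out.println(classify(new int[] {10, 5, 20, 15, 8}));   // OPEN_HAND
        System.out.println(classify(new int[] {90, 10, 95, 88, 92})); // POINT
    }
}
```

In practice you would run this classification on each frame and only act when the label is stable for a few frames, to avoid flicker between gestures.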
You could build a robot face. It would need quite a few motors, possibly servo motors, to show any realistic facial expressions. Capturing facial expressions is quite tricky, however. I don't know whether there is any computer vision software that could run on a PC to do this, but if there were, you could then send commands to the NXT face robot to replay the facial expression.
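However the expression is captured, at some point it has to become motor positions. A rough sketch, assuming the PC-side software reduces the mouth to a single "smile" parameter and the mouth corners are on servos with a 90-degree neutral and 30 degrees of travel (all of which are my assumptions):

```java
// Sketch: map an expression parameter to a servo angle for a robot face.
// Assumption: smile runs from -1.0 (full frown) to +1.0 (full smile),
// and each mouth-corner servo sits at 90 degrees when neutral.
public class FaceServos {
    public static int mouthCornerAngle(double smile) {
        if (smile > 1.0) smile = 1.0;    // clamp out-of-range input
        if (smile < -1.0) smile = -1.0;
        return (int) Math.round(90 + 30 * smile);
    }

    public static void main(String[] args) {
        System.out.println(mouthCornerAngle(0.0));  // 90  (neutral)
        System.out.println(mouthCornerAngle(1.0));  // 120 (full smile)
        System.out.println(mouthCornerAngle(-0.5)); // 75  (half frown)
    }
}
```

You would have one such mapping per facial motor (eyebrows, eyelids, jaw), and the PC would just stream the parameter values to the NXT.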
To capture leg movements, you would probably have to use wireless sensors, unless you were willing to run long cables to sensors such as rotation sensors, the tachometer built into an NXT motor, gyro sensors, or acceleration sensors.
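With a gyro sensor on a leg, you get angular rate rather than angle, so you have to integrate over time. A simple sketch, with made-up sample values; a real setup would also need drift correction, for example from an accelerometer:

```java
// Sketch: estimate a joint angle from gyro readings by integrating
// angular rate over time. Sample values here are invented for illustration.
public class GyroAngle {
    // rates: angular velocity samples in degrees/second
    // dt: time between samples in seconds
    public static double integrate(double[] rates, double dt) {
        double angle = 0.0;
        for (double r : rates) {
            angle += r * dt;   // simple rectangular integration
        }
        return angle;
    }

    public static void main(String[] args) {
        // Ten samples of 50 deg/s at 100 Hz: the joint swung about 5 degrees.
        double[] rates = new double[10];
        java.util.Arrays.fill(rates, 50.0);
        System.out.println(integrate(rates, 0.01));
    }
}
```

The tachometer and rotation-sensor options are easier because they report position directly, but they only work where you can couple the sensor to the joint mechanically.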
I have just seen an item on The Gadget Show (Channel Five in the UK) featuring shoes with a variety of sensors that control an MP3 player; I think you can watch it on the Internet. You could build an NXT version of this: an NXT mounted on a shoe with sensors such as a touch sensor and an acceleration (tilt) sensor, which you could use to control anything over Bluetooth with your foot.
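The control logic for such a foot controller is straightforward. A sketch, assuming a heel-mounted touch sensor and a tilt reading in degrees; the command names are invented, and on the real device they would be sent over the Bluetooth link rather than returned as strings:

```java
// Sketch: map shoe-mounted NXT sensor readings to player commands.
// Assumptions: touch sensor under the heel, tilt in degrees from an
// acceleration sensor (positive = toes up). Command names are hypothetical.
public class FootController {
    public static String command(boolean touchPressed, int tilt) {
        if (touchPressed) return "PLAY_PAUSE"; // heel tap toggles playback
        if (tilt > 20)    return "NEXT_TRACK"; // tip the foot up
        if (tilt < -20)   return "PREV_TRACK"; // tip the foot down
        return "NONE";                         // dead zone: ignore small tilts
    }

    public static void main(String[] args) {
        System.out.println(command(true, 0));    // PLAY_PAUSE
        System.out.println(command(false, 30));  // NEXT_TRACK
        System.out.println(command(false, -25)); // PREV_TRACK
        System.out.println(command(false, 5));   // NONE
    }
}
```

The dead zone around level is important, since ordinary walking would otherwise fire commands constantly.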