Live video feed of hands

Synthesia is a living project. You can help by sharing your ideas.
Search the forum before posting your idea. :D

No explicit, hateful, or hurtful language. Nothing illegal.
Nicholas
Posts: 12056

Post by Nicholas » 04-29-11 12:48 am

I don't see the price ever going that high. That's a different market segment than I'm interested in. I like the $25 crowd. ;)

If I'm understanding your mock-up, you're looking for real-time finger detection in the live video feed? It would be really cool to be able to place markers on the fingers like that so you could see exactly where they needed to go.

Incredibly hard problem. Something I've been thinking about for years though. (There are some ancient forum posts to that effect someplace.)

Not only is video lag your enemy there, but your detection and tracking also have to happen almost instantaneously. Add to that how many wacky shapes you can get your hand into where one finger is covered by another, or how at a particular camera angle you can't quite see things perfectly, or there isn't enough light in the room to get a good, clean signal...

Still, it's sort of the holy grail, isn't it? :D

I would love to do that.

Pianotehead
Posts: 319

Post by Pianotehead » 04-29-11 11:54 am

Nicholas wrote:...It would be really cool to be able to place markers on the fingers like that so you could see exactly where they needed to go.

Incredibly hard problem.
That's what I thought. I've been learning C# and ASP.NET, took a few months' course at a tutorial center, and I'm still doing relatively simple programming. So I can imagine something like this must be complicated.

Lemo
Posts: 313

Post by Lemo » 04-29-11 12:11 pm

Check this one, just like Kasper's mock-up ;)
That's some AR stuff I found a few weeks ago:
http://ilab.cs.ucsb.edu/projects/taehee ... ndyAR.html

I'm not sure it would help with video lag, though.
Let's see how Synthesia looks with just the upcoming fingering numbers first :roll:
Stuff & experiments for Synthesia: Gramp v0.2 · Skinbox · FireSynth · Video · Webradio

aria1121
Posts: 1502

Post by aria1121 » 04-29-11 1:15 pm

Seems like there needs to be some hand-detection calibration step.
The same for the piano: "where are the keys?"

Nicholas
Posts: 12056

Post by Nicholas » 04-29-11 5:23 pm

That is very cool. Still a much simpler version of the problem (palm open, no gestures, no weird hand configurations), but it's definitely a start.

The source link on the page is dead, but I just contacted one of the authors hoping he could point me in the right direction. Here's hoping.

TonE
Synthesia Donor
Posts: 1180

Post by TonE » 05-06-11 3:06 am

Maybe this YouTube video is also interesting:
http://www.youtube.com/watch?v=5nZSVjY_gNo

All you need is a 100-euro Microsoft Kinect sensor.

Nicholas
Posts: 12056

Post by Nicholas » 05-06-11 4:05 am

I've been looking into depth sensors since a year before the Kinect (or Project Natal, for that matter) was announced.

I haven't gotten a chance to play around with it yet, but it is an incredibly low-cost way to get a really awesome sensor. Placement is still tricky. To get it to see an entire keyboard, you'd have to have it near your ceiling.

TonE
Synthesia Donor
Posts: 1180

Post by TonE » 05-06-11 4:20 am

Nicholas wrote:Placement is still tricky. To get it to see an entire keyboard, you'd have to have it near your ceiling.
Well, yes, that might be the requirement for the correct view direction of the sensor for normal piano playing: hanging above the keyboard. Using another Kinect for a side view would even give you 6D data, which might help reduce the algorithms' CPU use. :)
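The appeal of the overhead depth view can be sketched in a few lines: with the sensor looking straight down, the keyboard sits at a roughly constant distance, so anything measurably closer than that plane is a hand. This is a minimal, illustrative sketch only; the constants and the tiny synthetic frame are made up, and none of the names come from any Kinect SDK or from Synthesia.

```python
# Hypothetical depth-threshold sketch (illustrative values, not real
# sensor data): the keyboard plane is calibrated once with no hands in
# view, then any pixel closer than that plane is flagged as "hand".

KEYBOARD_DEPTH_MM = 900   # assumed distance to the key surface
HAND_MARGIN_MM = 30       # ignore sensor noise near the key plane

def hand_mask(depth_frame):
    """Return a boolean mask marking pixels closer than the key plane.

    Depth values of 0 (invalid readings) are excluded.
    """
    cutoff = KEYBOARD_DEPTH_MM - HAND_MARGIN_MM
    return [[0 < d < cutoff for d in row] for row in depth_frame]

# Synthetic 4x6 frame in millimeters: two "fingertips" hovering
# roughly 120 mm above the keys.
frame = [
    [900, 900, 780, 900, 900, 900],
    [900, 900, 780, 900, 775, 900],
    [900, 900, 900, 900, 775, 900],
    [900, 900, 900, 900, 900, 900],
]
mask = hand_mask(frame)
fingertips = [(r, c) for r, row in enumerate(mask)
              for c, hit in enumerate(row) if hit]
```

A real pipeline would still need to cluster those pixels into fingers and handle occlusion, which is where the difficulty Nicholas describes comes back in.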

Nicholas
Posts: 12056

Post by Nicholas » 05-06-11 5:03 am

Then you run into the problem of finding people that have two Kinect sensors. Ideally the feature would be something a large number of people would be able to use. ;)

TonE
Synthesia Donor
Posts: 1180

Post by TonE » 05-06-11 5:08 am

OK, then: one Kinect as standard, and two Kinects as a bonus version for those who show interest in possible improvements.


User avatar
DC64
Posts: 830
Location: Earth, U.S.

Post by DC64 » 05-06-11 10:40 pm

So if this goes ahead (and it probably will), how long will it take to finish?
Or even to produce?
"And now for something completely different."

User avatar
swalker133
Posts: 246
Location: @iosmusician

Post by swalker133 » 07-09-11 1:39 am

Any new development news on this feature?

just politely asking :)
Learning, creating, recording, and performing music on the iPhone and iPad...
http://www.iosmusician.com/

Nicholas
Posts: 12056

Post by Nicholas » 07-09-11 1:48 am

It still has a ways to go on the voting list (though it was a late addition, so it's kind of behind the rest of the features there).

Otherwise, over the last release or two (and even more in the next couple) I've been slowly wrangling various bits of code here and there into place to get ready for it.

Right now I have the iPad port prioritized higher though, so it'll be after that, for better or worse.

User avatar
swalker133
Posts: 246
Location: @iosmusician

Post by swalker133 » 11-11-11 7:28 pm

Okay, I think I've got a pretty good idea for how this could work...

Lots of people nowadays have iPhones, right? What if you made an iPhone app that would act as a webcam? It wouldn't send the video over WiFi like a Skype/FaceTime call; it could be plugged into your computer and transmit the video via USB... If you made an app for the App Store, it could promote not only Synthesia, but also the live video feed feature in Synthesia.

Of course, the question comes up: how would I get my iPhone positioned directly over my keyboard? I could make it work... I have a metal gated shelf right above my computer. I could rest my iPhone on top of it and place it just so the camera is in between the bars of the shelf... If that description doesn't make sense, don't bother trying to figure it out; all I'm saying is that I can make it work... Could you?

Yes or no?

Maybe we should take a poll to see how many people could make this work, to see if it's a good idea...
Learning, creating, recording, and performing music on the iPhone and iPad...
http://www.iosmusician.com/

Lemo
Posts: 313

Post by Lemo » 11-11-11 8:05 pm

You can already do that with some random iCrap app, if you like paying to use hardware you already have.
I'll just use my webcam.
Stuff & experiments for Synthesia: Gramp v0.2 · Skinbox · FireSynth · Video · Webradio

User avatar
swalker133
Posts: 246
Location: @iosmusician

Post by swalker133 » 11-11-11 8:11 pm

What if you don't have a webcam but you do have an iPhone? (I do.)
And I doubt that app integrates perfectly with Synthesia... ;)
Learning, creating, recording, and performing music on the iPhone and iPad...
http://www.iosmusician.com/

aria1121
Posts: 1502

Post by aria1121 » 11-12-11 9:33 am

@ swalker133
So you want to use your iPhone as a webcam? If Synthesia comes out on iPad, will you still use your iPhone as a webcam? Like duct-taping it to the ceiling; isn't that a bit much for a phone? :)

User avatar
jimhenry
Posts: 1759
Location: Southern California

Post by jimhenry » 11-12-11 5:37 pm

Maybe this would be a manageable way to provide hints with a live keyboard feed.
live video.jpg
You could take a view of the keyboard with no hands in it before you start playing, generate a polygon for each key, and transform the keyboard image and polygons so the back edge of the keys lines up with the falling-note display. Then you could fill the polygons with colors as the notes fall, creating a virtual lighted keyboard.
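The transform step of that idea can be sketched with a simple affine fit: three reference points clicked on the hands-free keyboard photo, matched to their positions in the note display, determine a warp that maps every key polygon into screen space for filling. This is a minimal, illustrative sketch only; the coordinates, point names, and helper functions are made up, not from Synthesia or Miditzer (a real implementation would likely use a perspective transform, e.g. OpenCV's `getPerspectiveTransform`, to handle camera tilt).

```python
# Hypothetical sketch: fit an affine warp x' = a*x + b*y + c (and
# likewise for y') from three photo/screen point pairs, then map a
# key's polygon into screen space so it can be color-filled.

def solve3(a, b):
    """Solve a 3x3 linear system a @ x = b by Gauss-Jordan elimination."""
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(3):
            if r != col and m[col][col]:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

def fit_affine(src, dst):
    """Three point pairs exactly determine the six affine coefficients."""
    a = [[x, y, 1.0] for x, y in src]
    ax = solve3(a, [x for x, _ in dst])   # coefficients for x'
    ay = solve3(a, [y for _, y in dst])   # coefficients for y'
    return lambda p: (ax[0] * p[0] + ax[1] * p[1] + ax[2],
                      ay[0] * p[0] + ay[1] * p[1] + ay[2])

# Calibration: three keyboard corners in the photo vs. on screen
# (illustrative numbers).
warp = fit_affine(src=[(10, 200), (630, 200), (10, 260)],
                  dst=[(0, 480), (1024, 480), (0, 540)])

# A key polygon traced on the photo, warped into screen space.
key_c4 = [(100, 200), (112, 200), (112, 260), (100, 260)]
screen_poly = [warp(p) for p in key_c4]
```

With the polygons in screen coordinates, the "virtual lighted keyboard" is just filling each one as its note falls.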
Jim Henry
Author of the Miditzer, a free virtual theatre pipe organ
http://www.VirtualOrgan.com/

User avatar
swalker133
Posts: 246
Location: @iosmusician

Post by swalker133 » 11-12-11 5:47 pm

@aria1121

Yeah, that's true... After posting it I realized it was a pretty stupid idea... :lol:
Learning, creating, recording, and performing music on the iPhone and iPad...
http://www.iosmusician.com/

Post Reply