Well folks, this final blog pretty much wraps up my entry into the Ultimate Coder Challenge, and I hope you found it interesting and inspiring. The last seven weeks have seen some very late nights, some triumphs and a fair few setbacks on a quest to build a Perceptual Computing app.
I hope this diary helps other coders who are considering entering the world of Perceptual coding. It is an exciting new technology to get involved in, and I invite anyone with a sense of adventure to give it a go.
The Final App
By the grace of Intel I have a few more days before I submit the final final version of the app to the judges, so I wanted to take the opportunity to release an almost-final version today as part of this final blog.
If you would like to download it, try it out with a friend and let me know if anything horrible occurs, it will help me produce a solid final version when the time comes. I will be testing internally of course, but there is no substitute for external testing, and an app like this needs a lot of time in the field!
Instructions
The download is a complete installer which will install DirectX, the Perceptual SDK and of course the app itself. Once installation is complete, you can launch the app from the desktop icon it creates. If you have already installed the Perceptual SDK, you can click Cancel to skip that part of the installer. You may need to restart your PC/Ultrabook after installing the SDK before the camera springs to life!
If you are lucky enough to own a Gesture Camera, select or say the word CAMERA to see yourself as a virtual 3D avatar, and see what happens when you move your finger really close to the lens! Everyone else is welcome to try the CALL feature, which will connect two online users in a virtual conference call, including the ability to sketch on the screen. To clear the sketch, just press the SPACE BAR.
I have added built-in video help in the form of the ASSISTANCE button, and to exit the app entirely, touch or speak the word TERMINATE.
Firewalls and Routers
Setting up your PC to allow the software to communicate over the network is potentially the trickiest part of this app. It's a two-step process. First you must configure your router or cable/ADSL modem to allow the app to both send and receive data. You do this by accessing your Port Forwarding settings (on my BT HomeHub the address is 192.168.1.254) and ensuring both TCP and UDP data can travel both ways through port 15432, with all such traffic directed to the local IP address of the PC that will make or receive calls.
The second task is to configure any firewalls you have in place along the same lines, ensuring the app (installed by default to your Documents folder) is permitted two-way TCP and UDP communication on all ports. At your own risk, you can disable the firewall while you run the app should the firewall configuration prove uncooperative.
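If you are using the built-in Windows Firewall, a couple of rules along these lines from an elevated command prompt should do the trick (the rule names are just labels of my choosing, and third-party firewalls will have their own equivalent):

    netsh advfirewall firewall add rule name="PerceptuCam TCP" dir=in action=allow protocol=TCP localport=15432
    netsh advfirewall firewall add rule name="PerceptuCam UDP" dir=in action=allow protocol=UDP localport=15432

Outbound traffic is allowed by default on Windows, so inbound rules are usually all you need.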
The app is designed to work over the internet by default, but if you want to use it over your local network (LAN), you can locate the app folder and rename "_ipoverride.txt" to "ipoverride.txt", then amend the contents to the local IP address of the PC you have installed the app on. Do the same with the app on another PC on the network and you will then be able to discover registered users locally. Alternatively, you can simply enter the IP address manually from the CALL screen.
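For example, if the PC running the app sits at 192.168.1.100 on your home network (an illustrative address; substitute your own), ipoverride.txt would contain just that single line:

    192.168.1.100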
The Last Seven Days
Most of the work since the last blog has focused on improving the speed of the app, the audio-visual synchronisation and the polish of the front-end GUI. For some reason my new GUI has been inspired by chocolate, and on a fast Ultrabook the app can now reach 60 fps thanks to some last-minute jiggery-pokery.
I also invented a new tracker :) Instead of a head, eye, gaze, hand, tongue or foot tracker, I created a 'body mass' tracker. That is, the camera figures out where most of the 'mass' exists within the shot and provides a coordinate to reflect this. The advantage over a head tracker is that individual hands and other protrusions don't affect the track, meaning when you lean left or right you are almost certainly going to get the desired effect, which in the case of this app is to move the camera position so you can see left and right along the table.
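To give you a flavour of the idea, here is a minimal C++ sketch (not the code from the app itself, which you will find in the full source below): it simply averages the positions of all depth pixels that fall within a plausible 'user' range. The buffer layout and the threshold values are my illustrative assumptions.

    #include <cstdint>

    struct TrackPoint { float x; float y; bool valid; };

    // Average the position of every depth pixel within a plausible 'user' range.
    // depth: row-major 16-bit depth image in millimetres (an assumed layout).
    TrackPoint BodyMassTrack(const uint16_t* depth, int width, int height,
                             uint16_t nearMM, uint16_t farMM)
    {
        long long sumX = 0, sumY = 0, count = 0;
        for (int y = 0; y < height; ++y)
            for (int x = 0; x < width; ++x)
            {
                uint16_t d = depth[y * width + x];
                if (d > nearMM && d < farMM)   // pixel likely belongs to the user
                {
                    sumX += x;
                    sumY += y;
                    ++count;
                }
            }

        TrackPoint p = { 0.0f, 0.0f, false };
        if (count > 1000)                      // demand a decent amount of 'mass' to reject noise
        {
            p.x = (float)sumX / (float)count;  // centre of mass in pixel coordinates
            p.y = (float)sumY / (float)count;
            p.valid = true;
        }
        return p;
    }

Because every pixel casts a tiny vote, a waving hand barely nudges the average, whereas leaning your whole torso moves it decisively.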
I could not resist adding a little spark to the new CAMERA mode. Now when you wave your finger in front of the camera, green sparks are emitted. When you move your finger closer to the camera, it blasts out hot red sparks.
It is the precursor to an instant-feedback gesture system that I have a few ideas for, but perhaps that is something for another day :) It was fun to see when I added it though, so I kept it in for you.
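For the curious, the depth-to-colour mapping amounts to something like this C++ sketch; the distance ranges are illustrative guesses rather than the values used in the app:

    struct Colour { unsigned char r, g, b; };

    // Map finger depth to a spark colour: green at a comfortable distance,
    // blending to hot red as the finger approaches the lens.
    Colour SparkColour(float fingerDepthMM)
    {
        const float nearMM = 150.0f;   // almost touching the lens (illustrative)
        const float farMM  = 600.0f;   // comfortable gesture distance (illustrative)
        float t = (fingerDepthMM - nearMM) / (farMM - nearMM);
        if (t < 0.0f) t = 0.0f;        // clamp to the 0..1 range
        if (t > 1.0f) t = 1.0f;
        Colour c;
        c.r = (unsigned char)(255.0f * (1.0f - t));  // more red when close
        c.g = (unsigned char)(255.0f * t);           // more green when far
        c.b = 0;
        return c;
    }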
The Full Source Code
For coding fans, find below links to the entire DBP and C++ source for the app, which is not pretty, efficient or elegant, but it is honest and readable, and a great place to cut and paste from:
http://videochat.thegamecreators.com/PerceptuCam.dba
http://videochat.thegamecreators.com/PerceptuCam.cpp
If there is enough demand, I will formalise the Perceptual Computing code into official DBP commands, and perhaps even migrate them to AGK as well.
That's All Folks!
I'd like to extend my thanks to Bob and Wendy for their diligence and professionalism in making this competition a smooth and enjoyable experience. Big hugs to all the judges for their encouragement and of course for putting up with my occasionally rambling blog posts. I would also like to tip my hat to the six teams, comrades who made the competition feel more like a chilled-out V.I.P. hackathon. I've learned so much from you guys these past few weeks and I'm humbled to have been part of this select group.
Mushy stuff over with, I'll finish by saying goodbye, and I look forward to dismantling more cutting-edge technology for you in the very near future.