Project page can be found here
The First Hackathon
First things first: I've never done a hackathon. In university, I was always too busy to lose a whole weekend and ruin my energy for the week. And outside of university, I didn't know anyone willing. But then I was invited back to my old university to work on a project with a friend. So that meant getting my hands dirty. Let's go!
My apartment is littered with IoT. Some DIY. Some consumer. All of it still very far from the seamless home we envision in science fiction. The promise of our hack: point at your device and activate it, without thinking about any apps or voice assistants. A hefty promise for 36 hours.
36 Hours Remain
When the venue doors opened, Jackson and I weren't anywhere near there. We had already checked in and left to eat by our hotel. Sitting there in a restaurant booth, I pitch our initial architecture: a Python backend sending and receiving messages through a RabbitMQ instance, devices listening on RabbitMQ queues for controller actions performed on them and responding accordingly, and a React Native app to update the locations of the devices, which is also just a message to the backend.
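To give a feel for that message flow, here's a minimal sketch of the device-action payload. The queue layout, field names, and `toggle` action are all assumptions for illustration, not what we actually shipped:

```python
import json

# Hypothetical payload the backend publishes when a controller
# activates a device; field names are illustrative only.
def make_action_message(device_id: str, action: str) -> bytes:
    return json.dumps({"device_id": device_id, "action": action}).encode()

# Hypothetical device-side handler: each device would listen on its
# own queue and react to payloads like this.
def handle_action(body: bytes) -> str:
    msg = json.loads(body)
    if msg["action"] == "toggle":
        return f"toggling {msg['device_id']}"
    return f"ignoring {msg['action']} for {msg['device_id']}"
```

With a real RabbitMQ instance, the backend side would publish these bytes to the device's queue (e.g. with pika's `basic_publish`), and the device would register something like `handle_action` as its consumer callback.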
After a midnight dinner, we head to the venue to meet our other teammates. We meet up in a classroom and start drawing out our architecture and writing Trello tickets. Organization is key, of course. 1am turned to 3am, which turned to 5am. It's sleep time. I wasn't going to limit my teammates' sleep before the tires hit the road.
20 Hours Remain
It's morning. Well, I guess afternoon; it's just past noon. 7 hours of sleep. Not bad. Shower and go time.
I brought a huge amount of resources to work with: a desktop, a bin of hardware, and two sets of PC VR kits. (An in-depth Valve Index review is still coming, check back soon!) All four of us helped get it inside to a place where we could work. Once we were set up, it was time to get to work. We made skeletons for our proofs of concept.
The React Native app was simple, nothing fancy to start. The RabbitMQ instance was also fairly straightforward. And a Raspberry Pi that controls an LED is just plain simple. The backend is what Will and I were building. We got as far as Python spitting out the X, Y, Z, pitch, yaw, and roll of our controllers. But that wouldn't be enough: we needed a way to calculate whether those coordinates were pointing at our physical items in space. And no one was going to do the linear algebra to write those algorithms after 6 hours of work, and now we're hungry. Things are getting scary. If we can't get this backend working, we don't have a demo.
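For the curious, the first step of that math isn't so bad: turning the controller's pitch and yaw into a forward direction vector. The angle conventions below (yaw about the vertical Y axis, pitch tilting up and down) are an assumption; the actual tracking API hands you a full pose, not loose angles:

```python
import math

def forward_vector(pitch_deg: float, yaw_deg: float) -> tuple:
    """Unit vector the controller points along, assuming yaw rotates
    about the vertical (Y) axis and pitch tilts up/down. Roll doesn't
    change where the controller points, so it's ignored here."""
    pitch = math.radians(pitch_deg)
    yaw = math.radians(yaw_deg)
    x = math.cos(pitch) * math.sin(yaw)
    y = math.sin(pitch)
    z = math.cos(pitch) * math.cos(yaw)
    return (x, y, z)
```

With zero pitch and yaw this gives (0, 0, 1), i.e. pointing straight down the Z axis; pitching up 90 degrees gives (0, 1, 0), straight up.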
12 Hours Remain
I sit there in a dining common, desperately reading through articles about the mathematics involved in raycasting in 3-dimensional space. And every article about VR development only ever mentions Unity and Unreal. This is proving much harder than initially expected. I even asked if we should just bail, because I didn't think we had any chance of having a demo. But Jackson tells me to relax and focus.
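The core test those articles dance around is small: how close does the controller's ray pass to each device's known position, and is the device in front of the controller at all? A plain-Python sketch of that point-to-ray distance check (an illustration of the idea, not our actual code; Unity's built-in raycasting does this against colliders rather than points):

```python
import math

def distance_to_ray(origin, direction, point):
    """Shortest distance from `point` to the ray starting at `origin`
    along the unit vector `direction`. Points behind the controller
    (negative projection onto the ray) are treated as unreachable."""
    # Vector from the ray origin to the candidate point.
    v = [p - o for p, o in zip(point, origin)]
    # Length of v projected onto the ray direction.
    t = sum(a * b for a, b in zip(v, direction))
    if t < 0:
        return float("inf")  # behind the controller, never a hit
    # Closest point on the ray, then the distance to it.
    closest = [o + t * d for o, d in zip(origin, direction)]
    return math.dist(closest, point)
```

Picking the targeted device is then just taking the device with the smallest distance under some threshold.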
And then, on a whim, I asked Harry over Slack to start on a Unity implementation of the backend. We could work on it in parallel with the Python one. I put away my phone and get back to finishing my food.
During the walk back to the venue, I thought that pivoting to Unity would take too much time. In retrospect, I was very pessimistic. When I came back to the room, not even 15 minutes later, Harry had a full crosshair-based raycasting engine ready to demo.
Hope was once again refueled. Harry and Will left for dinner, and I went ahead and enabled SteamVR support so the raycast would come from a controller and not the camera. It only took half an hour. I didn't expect Unity development to be so easy to pick up. And once we implemented AMQP in Unity, we could turn on the LED! Just by pointing! No headset required!
8 Hours Remain
We're almost there. We've updated the app to send a message that moves the Unity game object representing our Raspberry Pi to wherever the controller is during configuration. Now the whole team is polishing and trying to make at least one more controllable object. With so little time, though, those attempts would prove fruitless. We need those 3 hours of sleep to demo with any elegance.
We got the perfect room for the demo, and all of our judges were engaged and understood our project very well. I could explain the tech like I would to a client. And with that, we became finalists.
As finalists, we had to present during the closing ceremony, which was underwhelming because we couldn't fully demo under such tight time limits. We were given 4th place. In hindsight, we feel the judges thought we had too much experience to compete fairly with the other students, and scored us accordingly. Fair enough. We're still proud of what we built, and we celebrated on Sunday night before conking out entirely, only to wake up for work on Monday.
Source code will be available shortly: a number of hard-coded configuration values need to be removed from the git history first, since proper configuration wasn't an option in so little time. After that, it will be available.