On October 12th, I attended the 30th edition of Toronto DemoCamp, and saw four demos from local startups.
Upverter is an online electronic design tool, using HTML5, JavaScript and Google libraries to provide a drawing canvas for electrical engineers. With about 40,000 lines of code, it offers fairly complex functionality, and they are hoping to displace $100K enterprise tools. They are seeing some enterprise adoption, but are also pushing into the university and college space to provide free tools for EE students doing circuit design, who will presumably take that knowledge into their future workplaces. Real-time design collaboration is built in and will be released in the next few weeks, and they already allow some collaboration and reuse of common components. They also integrate with manufacturers and distributors, providing component catalogues as input to the design, and “print to order/make” on its completion.
Vidyard is a video player for corporate websites, intended to avoid the problems of native YouTube embedding, including corporate networks that block YouTube content. They provide customization of the video player, SEO and analytics, including analytics from the video cross-posted to YouTube. For me, the most interesting part was that they built this in 16 weeks, fully embracing the idea that if you’re a startup, you can do it faster.
Blu Trumpet is an advertising platform based on application discovery, providing an SDK that embeds an app explorer in a publisher’s app, displays a list of “related” or partner apps, and redirects to the App Store.
Maide Control was the most exciting demo for me that evening, mostly because it turned my preconceived notion of how a gadget is supposed to be used on its head: they allow you to use your iPad as an input controller for 3D navigation, rather than for consumption of information. In other words, you don’t see the model on your iPad; you see it in the native application on your computer, while the iPad acts as a touch-based input device that recognizes gestures and translates them into commands for the application.
That’s not to say that you’ll give up using your iPad for consumption, but that you’ll extend its use with a completely new mode of functionality during an activity (navigating a 3D space such as a building model) when your iPad would otherwise be languishing in a drawer. They gave a demo of using an iPad to navigate a 3D city model in SketchUp, taking full advantage of multi-touch capabilities to zoom and reorient the model. When I saw this, I immediately thought of Ross Brown and his 3D process models (BPMVE); even for 2D models, the idea of a handheld touchpad for navigating a model while displaying it during a group presentation is definitely compelling. Add the ability for multiple iPads to interface simultaneously, and you have a recipe for in-person group model collaboration that could be awesome.
They also showed the iPad and a mouse being used at the same time, one controlling the view while the other draws; for impatient, ambidextrous people like me, that’s a dream come true. They have to build an interface to each specific application, as they have already done with SketchUp, but I can imagine a huge market for this with Autodesk’s products, and a somewhat smaller one for 2D Visio model manipulation.
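Maide hasn’t published how their gesture pipeline works, but the core idea, turning raw multi-touch positions on the tablet into camera commands streamed to a desktop 3D application, can be sketched in a few lines. This is purely a hypothetical illustration (the function name and command format are my own invention, not Maide’s API): compare two successive two-finger touch frames, derive a zoom factor from the change in finger spread and a pan delta from the movement of the midpoint between the fingers.

```python
import math

def interpret_two_finger_gesture(prev, curr):
    """Map two successive two-finger touch frames to 3D camera commands.

    prev and curr are ((x1, y1), (x2, y2)) finger positions in screen
    points. Returns a zoom factor (from pinch) and a pan delta (from
    dragging both fingers), i.e. the kind of commands a tablet
    controller might stream to a desktop 3D app. Hypothetical sketch,
    not Maide's actual protocol.
    """
    def spread(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    def centroid(points):
        (x1, y1), (x2, y2) = points
        return ((x1 + x2) / 2, (y1 + y2) / 2)

    zoom = spread(curr) / spread(prev)            # fingers apart -> zoom in
    px, py = centroid(prev)
    cx, cy = centroid(curr)
    pan = (cx - px, cy - py)                      # both fingers move -> pan
    return {"zoom": zoom, "pan": pan}

# Fingers spread symmetrically from 100 pt apart to 200 pt apart:
# zoom factor 2.0, no pan (the midpoint didn't move).
cmd = interpret_two_finger_gesture(((0, 0), (100, 0)), ((-50, 0), (150, 0)))
```

A real controller would run this per touch frame on the tablet and send the resulting commands over the network to a plugin in the host application, which is presumably why each target application (SketchUp, and eventually others) needs its own interface.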
Disappointingly, Kobo didn’t show in spite of being on the schedule; it was probably just a week too early to give us a sneak peek at their new gadget.