After a suggestion by my tutor yesterday, I considered the possibility of using QuickTime VR to create the environment for the planets. In that case, this QuickTime VR environment would be the application itself.
It looked like a promising system. From the start, though, there was a problem: it works with still images, not video, and I would like to use motion (I hadn’t clarified this before the suggestion was made). Additionally, I don’t know whether it would work with material of a more abstract nature than clean photography.
Researching further into this, it appears that support for it isn’t as wide as one would expect. This link for instance suggests that there is no support for it on the iPad, which is made by the same company. Also, the official web page for QuickTime VR at Apple’s website is dead (link). I started thinking that maybe it’s a technology which looked very promising a few years ago but didn’t take the world by storm, and support for it is decreasing. Even if this is not true, it seems to me that designing applications for mobile devices is a smarter and more future-proof decision.
After researching online for applications that use panoramic pictures (with very few results) and for software to create them, I downloaded two panorama-creation programs and tried them, to decide whether the results would be of use in a potential application. I used two random ‘Venus’ photos for now, just to test what the software actually does.
The first one I tried was ‘Hugin’ which I mentioned in the previous post. I wasn’t satisfied with the results.
I also tried ‘Arc Panorama’ (link), which was suggested by my tutor. This was more interesting, but I still do not see how the material could be used in a potential application. What it basically does is stitch two or more photos together, creating some distortion in the perspective. The final outcome, though, is still a still image – in order to navigate a space within an application, this distortion in perspective needs to change in real time. That requires a 3D space and the use of a virtual ‘camera’. It sounds simple but is actually very complicated – I do not have experience designing in 3D. Granted, by 3D standards it is a very simple form, but it is a completely different environment. On top of that, I would need to render textures and work with a virtual ‘camera’. This would be a project for collaborating with a 3D animator or games developer – however, there is no time for that within this module.
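To make the ‘virtual camera’ idea a bit more concrete, here is a minimal sketch (in Python, purely for illustration) of how panning works: given a full 360° panoramic image, the camera’s yaw angle determines which slice of the image is visible at any moment. The image width and field of view are assumed values, and a real viewer would also need to correct the perspective distortion, not just pick columns.

```python
# Minimal sketch of a 'virtual camera' panning across an equirectangular
# panorama. As the yaw angle changes in real time, a different window of
# pixel columns becomes visible, which is what creates the sense of
# looking around a space.

def view_window(yaw_deg, pano_width, fov_deg=90):
    """Return the (start, end) pixel columns visible at a given yaw."""
    # One degree of yaw corresponds to pano_width / 360 pixels.
    px_per_deg = pano_width / 360.0
    centre = (yaw_deg % 360) * px_per_deg
    half = (fov_deg / 2.0) * px_per_deg
    # Columns wrap around, since the panorama covers a full circle.
    start = (centre - half) % pano_width
    end = (centre + half) % pano_width
    return int(start), int(end)

print(view_window(0, 3600))    # → (3150, 450): view wraps around the seam
print(view_window(180, 3600))  # → (1350, 2250): looking the opposite way
```

The point of the sketch is simply that the visible slice is recomputed every frame as the user drags – which is exactly the real-time behaviour a stitched still image cannot provide on its own.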
After a suggestion by my tutor, I have begun looking at applications with a panoramic view and thinking about whether this element could be incorporated into my applications.
Most of the material I’ve found is about applications similar to this (link):
There are also some desktop applications that claim to create panoramic views by stitching together photographs, like this one (link).
Adobe AIR is a very useful tool that works alongside Flash: you can build an application in the Flash Professional software and export it as an AIR app, thus making it cross-platform.
There is a very interesting short introduction in video format on Adobe’s website: LINK
THIS LINK contains a 60-minute video which explains how to develop an application using Flash and export it for both iOS and Android mobile devices. It is, however, more than a year old, and the latest advances (Adobe Flash CS5.5) have made things easier.
A useful resource centre regarding the development of applications using Adobe Products can be found HERE.
THIS article explains how to export a Flash application for use on an iPhone or an iPad. It seems surprisingly straightforward. The main points are pasted below.
“In September 2010, Apple announced that it had lifted restrictions on its third-party developer guidelines. This means that you can now develop applications for iOS (iPhone and iPad) using the Adobe Flash Platform.
[..]Adobe Flash Professional CS5.5 lets you publish your ActionScript 3 projects to run as native apps on iOS. You will have access to nearly all the AIR 2.6 and Flash Player 10.2 APIs. For example, you can use APIs such as RTMP, Remote Shared Objects, and AMF as well as AIR APIs like SQLite and filesystem access.
Here’s how you would go about developing an iPhone app, for example. First, create your application on the desktop that fits the screen size of the iPhone. The iPhone’s display (like many smartphones) is 320 × 480. When the app is not in full-screen mode, 20 pixels are taken up by the status bar, so consider that when building your application.
Second, your finger is your pointing device. You can use mouse events (and touch events) to track the user’s intent, but remember that the finger is an inaccurate pointing device. Sometimes a finger goes down on the screen but moves up elsewhere. Certain behaviors that you may often employ in desktop application development will not necessarily apply to iOS devices.
The third and most important consideration when building your application is performance. Performance, performance, performance! The iPhone is most decidedly not a desktop computer. It has very powerful and sophisticated hardware, but there is a wide spectrum of capabilities between the different generations of device, the amount of memory available, and the amount of processing power your application has at its disposal.”
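The screen-size and touch points from the quoted article can be sketched in a few lines. This is Python rather than ActionScript, purely for illustration; the 44-pixel minimum touch-target size is my own assumption based on common touch-interface guidelines, not something the article states:

```python
# Layout arithmetic from the quoted article: the iPhone screen is
# 320x480, and 20 pixels go to the status bar when the app is not
# running full-screen.

SCREEN_W, SCREEN_H = 320, 480
STATUS_BAR = 20

def usable_stage(full_screen=False):
    """Stage size available to the app, accounting for the status bar."""
    h = SCREEN_H if full_screen else SCREEN_H - STATUS_BAR
    return SCREEN_W, h

def is_touch_friendly(w, h, min_side=44):
    """Fingers are imprecise: reject targets smaller than min_side px."""
    return w >= min_side and h >= min_side

print(usable_stage())             # → (320, 460)
print(is_touch_friendly(30, 50))  # → False: too narrow for a finger
```

Trivial arithmetic, but it is exactly the kind of check that is easy to forget when moving a desktop Flash project to a phone.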
I revisited the websites mentioned in an earlier post and examined the styles of games that would be suitable for the Solar System application. The first is the concept of a puzzle. I do not find this particularly interesting; however, it doesn’t need to use rectangles like the example shown below. The child would be required to place each element in its correct position – e.g. set the planets of the solar system in the correct order, or the elements of a planet.
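As a rough sketch of how the ‘order the planets’ puzzle could be checked (a hypothetical Python illustration; the actual application would be built in Flash):

```python
# Hypothetical check for the 'order the planets' puzzle: compare the
# child's arrangement against the real order from the Sun, marking
# each slot right or wrong so the game can give per-planet feedback.

PLANET_ORDER = ["Mercury", "Venus", "Earth", "Mars",
                "Jupiter", "Saturn", "Uranus", "Neptune"]

def check_arrangement(attempt):
    """Return one boolean per slot, True where the placement is correct."""
    return [a == b for a, b in zip(attempt, PLANET_ORDER)]

attempt = ["Mercury", "Earth", "Venus", "Mars",
           "Jupiter", "Saturn", "Uranus", "Neptune"]
print(check_arrangement(attempt))
# → [True, False, False, True, True, True, True, True]: Venus/Earth swapped
```

Per-slot feedback like this would let the game highlight only the misplaced planets rather than simply saying ‘wrong’.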
The second style of gaming has to do with ‘shooting’ things (in this case balls). This sounds too simple but it is actually very effective, especially in the case of games for mobile phones.
‘Timez Attack’ is a particularly interesting case in my opinion, because it appears to be fully 3D, without compromises on the gaming aspect, while claiming to be educational.
In April 2010, Steve Jobs wrote a lengthy post criticising Flash (link), in which he announced that Apple would not support it on its mobile devices and claimed that it is insecure, energy-hungry and causes Macs to crash. While there may be some truth to that, many claim that the real reason behind this decision was business-oriented – especially the 30% commission paid to Apple for applications written in its native system. This situation seemingly brought the ‘death’ of Flash for future mobile applications before anything new and reliable existed to replace it.
I decided to do some research to understand what the situation is right now and what is expected in the near future. A blog in PC Pro magazine (link), posted just two months ago, provided me with interesting insight on the matter and made me realise that things aren’t as ‘black and white’ as I thought.
Some selected bits from that article:
“The fundamental shift from Flash to HTML5 in the browser is unavoidable, and now even Adobe is fully and clearly on board. However while “doing Flash in HTML5” sounds simple and desirable, that doesn’t mean it is./..to produce rich Flash-style results you’re going to need a dedicated Flash-style tool for design and output. And the most likely provider will be Adobe. No doubt the next version of Dreamweaver will add canvas tag capabilities while for more complex scenarios you will be able to use the all-new, dedicated, HTML5-native Adobe Edge.
Alternatively, Adobe has made it clear that it plans to graft HTML5 output onto its existing Flash tools whenever that’s possible, so why not stick with what you know?
Most commentators are assuming that Adobe is effectively throwing in the towel when it comes to Flash for the mobile market, but again this is a mistake. Yes the Flash player has been ruled out, but, as I discuss in my current RWC column in the January edition of PC Pro, the Flash tools remain as relevant as ever.
..In particular it’s important to note that Adobe’s recent announcement says: ‘Our future work with Flash on mobile devices will be focused on enabling Flash developers to package native apps with Adobe AIR for all the major app stores’. Which makes it pretty clear that Adobe is planning to build on its existing Android and iOS native output with new support for Metro.
..It turns out (again) that the rumours of the death of Flash are greatly exaggerated in both the desktop and mobile arenas. In fact the technology and platform is arguably healthier and more relevant than it has ever been, just in the new guise of AIR.
..universal HTML5, native code and Flash in between.”
An idea suggested by my tutor is to involve the use of an ‘interactive glove’ (if suitable) in some of the applications. An interactive glove is basically a glove with sensors, which can be used as an input device. It sounds a bit extreme in theory, but in practice it makes perfect sense and shouldn’t be that difficult to make. As a matter of fact, according to this source (link), the first one was created by Laetitia Sonami in 1991 (and got a major upgrade in 1994). This is where the image below is taken from.
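To show why a sensor glove makes sense as an input device, here is a hypothetical sketch (in Python) of how raw flex-sensor readings could be turned into gestures. The finger names, value ranges and thresholds are all invented for illustration, and a real glove would need per-user calibration:

```python
# Hypothetical glove-as-input sketch: one flex-sensor reading per
# finger (0.0 = straight, 1.0 = fully bent) is thresholded into a
# simple gesture the application can respond to.

def classify_gesture(flex):
    """flex: dict of finger name -> bend amount in [0.0, 1.0]."""
    bent = {finger for finger, value in flex.items() if value > 0.6}
    if len(bent) == 5:
        return "fist"          # e.g. grab an object
    if bent == {"index"}:
        return "point"         # e.g. select a planet
    if not bent:
        return "open_hand"     # e.g. release / reset the view
    return "unknown"

reading = {"thumb": 0.1, "index": 0.9, "middle": 0.2,
           "ring": 0.1, "pinky": 0.15}
print(classify_gesture(reading))  # → point
```

Even a crude mapping like this would be enough to let a child ‘point’ at a planet or ‘grab’ it, which is the kind of interaction the glove idea promises.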
Researching further, I found that several people have tried similar things.
This article (link) mentions some of them.
This source (link) for instance talks of a ‘Zerkin Glove’.
A video of another person’s project can be seen here.
This video (link) shows an interactive glove being used with Google Earth.
Another interesting possibility is shown in this video (link), with a glove designed for the Wii.
In my opinion, though, the most promising piece of information is that Google has embraced the technology and might release something in the future (link). The idea has existed for so many years, yet it seems that it’s only when a powerful company adopts it that things really begin to ‘happen’.