"There�s An App for That" -- but how do we actually develop them? While smartphones and tablets are even getting increasingly more popular and their application scenarios are growing, we still develop them using only a standard integrated development environment. As context-based services and apps do, next to network connectivity, require lots of sensor data, the tools for providing realistic sensor data during development are still immature.
Developing, testing, debugging and evaluating these next-generation context-based apps requires sensor data for the mobile device -- acceleration, motion, light, sound, camera and many more sensors are available. However, existing development tools seriously limit application developers by providing such data either not at all or only on a very limited scale. Especially for indoor environments, with applications such as indoor navigation, seamless interaction between public and private displays, and activity recognition and monitoring, realistic sensor data are needed, and simulation support during the development phase is essential.
In this paper, we present our work towards a holistic approach to mobile application development in intelligent environments. It leverages the existing development tool chain and provides more effective and realistic means for mobile application development, demonstrated using the Android mobile device platform as an example.