First International Workshop on Testing The Cloud

co-located with ISSTA 2013, Lugano, Switzerland

CALL FOR PAPERS

About

Cloud computing is everywhere and seemingly inevitable: originally a layered abstraction over a heterogeneous environment, it has become the paradigm for large-scale data-oriented systems. And while it offers many attractive features (easy deployment of applications, resiliency, security, performance, scalability, elasticity, etc.), testing its robustness and reliability remains a major challenge. The Cloud is an intricate collection of interconnected and virtualised computers, connected services, and complex service-level agreements. From a testing perspective, the Cloud is therefore a complex composition of complex systems, and one may wonder whether anything like global testing is even possible. And if the answer is no, what can we conclude from partial tests? The question of testing this large, network-based, dynamic composition of computers, virtual machines, servers, services, and SLAs seems particularly difficult. It is also critical for Cloud vendors: customers’ trust is crucial for companies implementing Clouds, and they have to ensure that the system actually has the security and performance characteristics the marketing department highlights. This problem is a perfect example of concerns shared between academia and industry, and it covers a broad range of topics, from software development and code analysis to performance monitoring and formal models for system testing.

With TTC we aim to bring together researchers and practitioners interested in this difficult question of testing the cloud, i.e. a complex distributed, dynamic, and interconnected system. Hence we call for regular scientific submissions as well as industrial experience reports.

Topics of Interest

"Testing the Cloud" covers many different topics, much more than the list we wrapped up below. So we welcome academic and industrial contributions that sound relevant - whatever is the background of the authors. In particular, we will run regular academic sessions, but we are also likely to have a more industry-focused session where it will be possible to describe solutions deployed in product companies or best practices followed by practitioners.

  • Domain-specific languages for testing
  • Fault injection
  • Formal specification and verification of programming libraries and programs
  • Functional and structural testing
  • New tools for testing
  • Performance testing
  • Programming techniques and methodologies (that decrease the need for testing)
  • Replay techniques for multi-threaded applications
  • Static and dynamic program analysis (including code review)
  • Test generation algorithms and tools
  • Security testing of concurrent systems and applications in the cloud
  • Load testing
  • Live testing
  • Test environments vs. production environments
  • Test monitoring
  • Test and customer relationship management