
IT Confidence 2013 - 1st International Conference on IT Data collection, Analysis and Benchmarking

Date: 2013-10-03

Deadline: 2013-06-24

Venue: Rio de Janeiro, Brazil


Website: http://itconfidence2013.wordpress.com/

Topics/Call for Papers

Software-intensive organizations (SIOs) have long been aware of the significance of measurement processes for informed decision-making and for managing software processes, products and projects. Measurement is an important enabler for software, system and service process improvement. However, when starting an improvement program, SIOs do not always have historical data at their disposal, which can significantly increase the time, effort and cost needed to obtain a positive return on investment (ROI) from such measurement and monitoring activities.
Paradoxically, a cookbook is a historical database of 'recipes' far more robust than the repositories of many SIOs: it contains ingredients (resources), steps (process), wines to accompany the meals (non-functional factors) and cooking difficulty levels (risks), all leading to the overall cost to be paid as the sum of material and labor costs. Looking at most organizations' historical repositories, this level of detail is rarely reached, and experience remains the typical driver for estimating the effort and cost of new or changed projects and activities.
Data collection, analysis and benchmarking are key to solving this initial problem and to maximizing the value returned by projects over their lifetime by:
• Periodic revision of the thresholds of any KPI (Key Performance Indicator) established in measurement and/or project plans, aiming at a proper composition of a few, balanced measures in a Balanced Scorecard (BSC). A balanced measurement program can retain continual top-management support only if its benefits are exploited at the strategic level and throughout the organization, rather than looking only (or mostly) at the short-to-mid term.
• Helping to properly classify and distinguish 'apples and oranges': another common issue is the de facto strong separation between what is 'software' and what is a 'service', with their related quantitative measures. A maintenance activity is a 'service' and should be properly classified by applying a detailed taxonomy, such as the one proposed by the ISO/IEC/IEEE 14764:2006 standard.
• Assessing the realism of a supplier's proposal for a software project, minimizing the risk of cost overruns and schedule slippage.
• Software Cost Engineering: project estimation based on parametric models and historical data typically results in more accurate estimates and therefore in more successful projects (see the sketch after this list).
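Purely as an illustration (not part of the original call), the following Python sketch shows one way a parametric cost model of the kind mentioned in the last bullet can be calibrated on historical data: a basic power-law model, effort = a * size^b, fitted by log-linear least squares. The data set, function names and resulting coefficients are hypothetical and chosen only for illustration; real programs would typically rely on richer models (e.g. COCOMO II or ISBSG regression equations) with cost drivers and uncertainty ranges.

```python
import math

# Hypothetical historical data: (functional size in function points, effort in person-hours).
# In a real benchmarking program these figures would come from an organizational
# or ISBSG-style repository.
history = [(120, 950), (250, 2100), (400, 3600), (600, 5900), (850, 8800)]

def calibrate(data):
    """Fit effort = a * size**b by least squares on log-transformed data."""
    xs = [math.log(size) for size, _ in data]
    ys = [math.log(effort) for _, effort in data]
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = math.exp(mean_y - b * mean_x)
    return a, b

def estimate(size, a, b):
    """Estimate effort (person-hours) for a new project of the given functional size."""
    return a * size ** b

a, b = calibrate(history)
print(f"calibrated model: effort = {a:.2f} * size^{b:.2f}")
print(f"estimate for a 500 FP project: {estimate(500, a, b):.0f} person-hours")
```

In practice, such a core model would be complemented by cost drivers, confidence intervals and periodic recalibration as newly completed projects are added to the repository.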
Drawing on the ISBSG's 15 years of experience, this conference will provide a forum for both practitioners and researchers to discuss the most recent advances in data collection, analysis, standardization and benchmarking, and ways to reduce the resistance to data collection within SIOs, generating value from the creation, evolution and maintenance of project/organizational repositories able to support what ITIL calls the four 'DIKW' waves (Data → Information → Knowledge → Wisdom).
We invite professionals responsible for, involved in, or interested in data collection, analysis, standardization and benchmarking to share innovative ideas, experiences and concerns within this scope. The conference targets two types of contributions: (1) experience contributions, e.g. problem statements and practical solutions in planning and using benchmarking to support and drive strategic decision-making; and (2) research contributions, e.g. empirical studies, hypothesized models and suggestions for establishing effective benchmarking programs that improve software/service value management.
Conference topics include but are not limited to:
• Approaches for cost-effective and sustainable benchmarking programs
• Approaches for standardized data collection and analysis practices
• Innovative Software Cost Engineering approaches using parametric models and historical data
• Requirements and constraints for designing effective benchmarking programs to achieve better value management
• Benchmarking in SMEs (Small and Medium Enterprises) and VSEs (Very Small Entities) and the related constraints
• Internal and external benchmarking for better strategic, tactical and operational decision-making
• Empirical studies on the most effective measures/KPIs to include in a benchmarking program, and the level of granularity needed to properly support decision-making
• Alignment of business, tactical and operational goals, and the timing for revising the benchmarking strategy
The paper submission system and further instructions regarding deadlines are available at the submission page (https://www.easychair.org/conferences/?conf=itconf...). Both presentations and papers (with a related presentation) will be accepted. Presentations must be no longer than 25 slides and must contain no commercials (at most a single slide presenting your own organization), while papers must be no longer than 10 pages. The abstract can be written using the paper template mentioned above and should be no longer than 2 pages. Templates are available on the conference website.
Conference languages are English and Portuguese.
Important Dates
• Abstract submission: April 15, 2013
• Full presentation/paper submission: June 24, 2013
• Notification of acceptance/rejection: July 15, 2013
• Final presentation/paper version: July 29, 2013
