Transparent Rational Decisions by Argumentation (TRaDAr)

Start Date: 2012-09-01

End Date: 2015-08-31


Argumentation provides a powerful mechanism for dealing with incomplete, possibly inconsistent information and for resolving conflicts and differences of opinion amongst different parties. It is also useful for justifying outcomes. Thus, argumentation can support several aspects of decision-making, whether by individual entities performing critical thinking (needing to evaluate the pros and cons of conflicting decisions) or by multiple entities dialectically engaged in reaching mutually agreeable decisions (needing to assess the validity of the information the entities become aware of and to resolve conflicts), especially when decisions need to be transparently justified (e.g. in medicine).


Because of its potential to support decision-making when transparently justifying decisions is essential, argumentation has been considered in a number of settings, including medicine, law, e-procurement, e-business and design rationale in engineering. Existing argumentation-based decision-making methods give their users the transparency that argumentation affords, but typically lack either a means of formal evaluation sanctioning decisions as (individually or collectively) rational or a computational framework supporting automation. The combination of these three features (transparency, rationality and computational tools for automation) is essential for argumentation-based decision-making to have a fruitful impact on applications. For example, a medical practitioner would not find a “black-box” recommended decision useful, but would also not trust a fully transparent, dialectically justified decision without assurance that it is the best one (i.e. rational). In addition, the sheer volume of information doctors need to take into account nowadays to make decisions requires automated support.

TRaDAr aims to provide methods and prototype systems for various kinds of argumentation-based (individual and collaborative) decision-making that automatically generate transparent, rational decisions, while developing case studies in smart electricity and e-health to inform and validate these methods and systems. In this context, TRaDAr’s technical objectives are:

(O1) to provide novel argumentation-based formulations of decision problems for individual and collaborative decision-making;

(O2) to study formal properties of the formulations at (O1), sanctioning the rationality of decisions;

(O3) to provide real-world case studies in smart electricity and e-health for (individual and collaborative) decision-making, using the formulations at (O1) and demonstrating the importance of the properties at (O2) as well as the transparent nature of argumentation-based decision-making;

(O4) to define provably correct algorithms for the formulations at (O1), supporting rational and transparent (individual and collaborative) decision-making;

(O5) to implement prototype systems incorporating the computational methods at (O4), and use these systems to demonstrate the methodology at (O1-O2) for the case studies at (O3).

The project intends to develop novel techniques towards the achievement of these objectives within an existing framework of computational argumentation, termed assumption-based argumentation, adapting notions and techniques from classical (quantitative) decision theory and from mechanism design in economics.
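To give a flavour of how argumentation can sanction decisions computationally, the sketch below computes the grounded (sceptically acceptable) arguments of a Dung-style abstract argumentation framework, a simpler formalism underlying assumption-based argumentation. This is an illustrative assumption, not the project's actual machinery; the argument names in the example are hypothetical.

```python
# Illustrative sketch: grounded semantics for an abstract argumentation
# framework, given as a set of arguments and a set of attack pairs.
# (Hypothetical example; not the project's actual formalism or code.)

def grounded_extension(arguments, attacks):
    """Iteratively accept arguments all of whose attackers are defeated,
    then defeat everything an accepted argument attacks, until stable."""
    attackers = {a: {x for (x, y) in attacks if y == a} for a in arguments}
    accepted, defeated = set(), set()
    changed = True
    while changed:
        changed = False
        for a in arguments:
            if a in accepted or a in defeated:
                continue
            # Accept a once every attacker of a has been defeated
            # (unattacked arguments are accepted immediately).
            if attackers[a] <= defeated:
                accepted.add(a)
                changed = True
        for (x, y) in attacks:
            # Any argument attacked by an accepted argument is defeated.
            if x in accepted and y not in defeated:
                defeated.add(y)
                changed = True
    return accepted

# Hypothetical medical example: giving a drug is attacked by a suspected
# allergy, which is in turn attacked by a negative allergy test.
args = {"give_drug", "allergy", "give_placebo" if False else "test_negative"}
atts = {("allergy", "give_drug"), ("test_negative", "allergy")}
print(sorted(grounded_extension(args, atts)))  # → ['give_drug', 'test_negative']
```

Here "test_negative" is unattacked and hence accepted; it defeats "allergy", which reinstates "give_drug" — a transparent, step-by-step justification of the recommended decision.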

The envisaged TRaDAr methodology and systems will contribute to a sustainable society supported by the digital economy, and in particular will support people in making informed choices. The project will focus on demonstrating the proposed techniques in specific case studies (smart electricity, and e-health for breast cancer) in two chosen application areas (the digital economy and e-health), but its outcomes could reach further, to other case studies (e.g. in other areas of medicine) as well as other sectors (e.g. engineering, for supporting decisions on design choices).
