Each process can involve as many participants as you like – potentially tens, hundreds or even thousands. Depending on the activity (see below), participants can work individually or together in groups.
The following five decision activities are available. All five are potentially useful for a complete decision process, but you can use fewer, depending on what you want to do – e.g. perhaps just a Decision Survey, or a Ranking Survey followed by a Decision Survey, and so on.
As well as 1000Minds surveys (see below), you can embed another online survey – e.g. one easily created with Google Docs – into your survey to collect participants’ socio-demographic data or other information of interest.
This survey asks participants to answer a series of questions involving trade-offs between pre-specified criteria.
Participants’ answers reveal their individual preference values (or ‘weights’) – and also the average values for the group – representing the relative importance to them of the criteria they were asked about.
Participants’ answers to the questions, their preference values, their rankings of entered alternatives, and their rankings of all possible alternatives representable by the decision model can also be compared.
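1000Minds itself derives the preference values from participants’ trade-off answers. As a rough illustration only – with invented criteria, levels and weights, not values from any real survey – the sketch below shows how preference values for criterion levels can be combined additively to score and rank alternatives:

```python
# Illustrative sketch (invented data): preference values ('weights') for each
# criterion level, combined additively to score and rank alternatives.
weights = {
    ("cost", "low"): 0.40, ("cost", "high"): 0.00,
    ("quality", "good"): 0.35, ("quality", "poor"): 0.00,
    ("speed", "fast"): 0.25, ("speed", "slow"): 0.00,
}

# Each alternative is described by its level on each criterion.
alternatives = {
    "A": {"cost": "low", "quality": "poor", "speed": "fast"},
    "B": {"cost": "high", "quality": "good", "speed": "slow"},
}

def score(alt):
    # Sum the preference value of the alternative's level on each criterion.
    return sum(weights[(criterion, level)] for criterion, level in alt.items())

# Rank alternatives from highest to lowest total score.
ranking = sorted(alternatives, key=lambda a: score(alternatives[a]), reverse=True)
```

Here alternative A scores 0.65 and B scores 0.35, so A ranks first. The same additive logic extends to ranking every possible combination of levels representable by the model.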
Participants reveal their preference values as a group by voting on their decisions via the Internet and a teleconference.
Alternatively, they can use a decision-support centre, or bring their computers to a common location.
Participants’ decisions reveal their preference values as a group, and can be used to rank alternatives.
This survey asks participants to rank descriptions of hypothetical or real alternatives (e.g. case studies or ‘vignettes’) according to their intuitive judgements.
This reveals agreements and disagreements between participants with respect to their rankings of possible alternatives. The results can help establish whether a new decision-making or prioritisation method (e.g. via 1000Minds) is needed.
Participants can also work together as a group to rank the alternatives by consensus. This consensus ranking can be used later as a pseudo-gold standard to be compared against rankings from other steps in the decision process.
Also, by having participants discuss how they arrived at their rankings, appropriate criteria and levels for differentiating amongst alternatives can be teased out and then used to create a 1000Minds Decision Model.
This survey asks participants to categorise descriptions of hypothetical or real alternatives on pre-specified criteria.
This is useful for revealing agreements and disagreements between participants with respect to how alternatives are categorised. The results can highlight any issues about how criteria and levels are described, so they can be refined.
Participants can also work together as a group to categorise the alternatives by consensus, in the process further refining the criteria and levels.
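One simple way to quantify agreement between participants’ categorisations – purely an illustration with invented data, not a description of 1000Minds’ own reporting – is the proportion of alternatives two participants place in the same category:

```python
# Illustrative sketch (invented data): two participants' categorisations of
# the same alternatives on one criterion.
participant_1 = {"alt1": "high", "alt2": "medium", "alt3": "low"}
participant_2 = {"alt1": "high", "alt2": "low", "alt3": "low"}

# Count the alternatives categorised identically by both participants.
agreed = sum(participant_1[a] == participant_2[a] for a in participant_1)
agreement_rate = agreed / len(participant_1)  # fraction of matching categorisations
```

A low agreement rate on a particular criterion can flag that its levels are ambiguously described and need refining.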
This lets you compare participants’ rankings of alternatives from various other activities (see above).
This reveals similarities and differences between participants’ rankings of the alternatives from different activities in the decision process. This is useful for considering the face validity of the rankings.
For example, some users apply the consensus ranking from the Ranking Survey activity as a pseudo-gold standard to be compared against rankings from other activities.
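Similarity between two rankings of the same alternatives – e.g. a consensus ranking versus a model-based ranking – is often summarised with Kendall’s tau rank correlation. The sketch below is illustrative only, and assumes complete rankings with no ties:

```python
# Illustrative sketch: Kendall's tau for two complete, tie-free rankings
# of the same items (lists ordered best-first).
from itertools import combinations

def kendall_tau(rank_a, rank_b):
    pos_a = {item: i for i, item in enumerate(rank_a)}
    pos_b = {item: i for i, item in enumerate(rank_b)}
    concordant = discordant = 0
    for x, y in combinations(rank_a, 2):
        # Positive product: the pair is ordered the same way in both rankings.
        if (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y]) > 0:
            concordant += 1
        else:
            discordant += 1
    n = len(rank_a)
    return (concordant - discordant) / (n * (n - 1) / 2)
```

Tau is 1 when the two rankings are identical, -1 when one is the exact reverse of the other, and near 0 when they are unrelated – a compact way to gauge the face validity of rankings across activities.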
We’re also able to provide process templates to guide participants through a proven series of activities: from agreeing to take part in the process through to achieving the desired outcome.